Sorry, wrong output indeed; here is the correct one:

$ bin/find-hive-dependency.sh


Logging initialized using configuration in
file:/etc/hive/conf.dist/hive-log4j.properties

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in
[jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in
[jar:file:/opt/edw/hive/auxlib/hive-udfs.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

hive dependency:
/etc/hive/conf:/usr/lib/hive/lib/ant-1.9.1.jar:/usr/lib/hive/lib/hive-common-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/hive-service.jar:/usr/lib/hive/lib/hive-shims.jar:/usr/lib/hive/lib/hive-shims-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/optiq-avatica-0.5.jar:/usr/lib/hive/lib/servlet-api-2.5.jar:/usr/lib/hive/lib/paranamer-2.3.jar:/usr/lib/hive/lib/commons-pool-1.5.4.jar:/usr/lib/hive/lib/activation-1.1.jar:/usr/lib/hive/lib/hive-shims-common-secure.jar:/usr/lib/hive/lib/datanucleus-core-3.2.10.jar:/usr/lib/hive/lib/hive-beeline.jar:/usr/lib/hive/lib/hamcrest-core-1.1.jar:/usr/lib/hive/lib/jetty-6.1.26.jar:/usr/lib/hive/lib/jline-0.9.94.jar:/usr/lib/hive/lib/commons-httpclient-3.0.1.jar:/usr/lib/hive/lib/hive-metastore.jar:/usr/lib/hive/lib/avro-1.7.5.jar:/usr/lib/hive/lib/hive-common.jar:/usr/lib/hive/lib/guava-11.0.2.jar:/usr/lib/hive/lib/log4j-1.2.16.jar:/usr/lib/hive/lib/commons-compress-1.4.1.jar:/usr/lib/hive/lib/ant-launcher-1.9.1.jar:/usr/lib/hive/lib/hive-shims-common-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/lib/hive/lib/hive-cli.jar:/usr/lib/hive/lib/commons-codec-1.4.jar:/usr/lib/hive/lib/stax-api-1.0.1.jar:/usr/lib/hive/lib/hive-service-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/hive-testutils.jar:/usr/lib/hive/lib/oro-2.0.8.jar:/usr/lib/hive/lib/optiq-core-0.5.jar:/usr/lib/hive/lib/hive-shims-0.23-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/hive-cli-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/hive-beeline-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/commons-logging-1.1.3.jar:/usr/lib/hive/lib/hive-exec-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/hive-jdbc-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/hive-contrib-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/libthrift-0.9.0.jar:/usr/lib/hive/lib/linq4j-0.1.13.jar:/usr/lib/hive/lib/hive-custom-hooks-0.0.1-SNAPSHOT.jar:/usr/lib/hive/lib/jdo-api-3.0.1.jar:/usr/lib/hive/lib/hive-shims-0.20S-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/commons-collections-3.1.jar:/usr/lib/hive/lib/hive-shims-common.jar:/usr/lib/hive/lib/hive-hwi.jar:/usr/lib/hive/lib/xz-1.0.jar:/usr/lib/hive/lib/jetty-util-6.1.26.jar:/usr/lib/hive/lib/hive-metastore-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/hive-jdbc.jar:/usr/lib/hive/lib/servlet-api-2.5-20081211.jar:/usr/lib/hive/lib/commons-compiler-2.7.3.jar:/usr/lib/hive/lib/tempus-fugit-1.1.jar:/usr/lib/hive/lib/httpcore-4.2.5.jar:/usr/lib/hive/lib/hive-serde.jar:/usr/lib/hive/lib/hive-shims-0.20-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/mail-1.4.1.jar:/usr/lib/hive/lib/janino-2.7.3.jar:/usr/lib/hive/lib/hive-ant-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/asm-tree-3.1.jar:/usr/lib/hive/lib/eigenbase-properties-1.1.4.jar:/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar:/usr/lib/hive/lib/derby-10.10.1.1.jar:/usr/lib/hive/lib/commons-cli-1.2.jar:/usr/lib/hive/lib/stringtemplate-3.2.1.jar:/usr/lib/hive/lib/hive-hbase-handler-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/jta-1.1.jar:/usr/lib/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/lib/hive/lib/libfb303-0.9.0.jar:/usr/lib/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/lib/hive/lib/hive-hbase-handler.jar:/usr/lib/hive/lib/commons-lang-2.4.jar:/usr/lib/hive/lib/commons-dbcp-1.4.jar:/usr/lib/hive/lib/hive-ant.jar:/usr/lib/hive/lib/hive-testutils-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/derbynet-10.10.1.1.jar:/usr/lib/hive/lib/httpclient-4.2.5.jar:/usr/lib/hive/lib/antlr-2.7.7.jar:/usr/lib/hive/lib/hive-hwi-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/junit-4.10.jar:/usr/lib/hive/lib/derbyclient-10.10.1.1.jar:/usr/lib/hive/lib/hive-exec.jar:/usr/lib/hive/lib/datanucleus-rdbms-3.2.9.jar:/usr/lib/hive/lib/groovy-all-2.1.6.jar:/usr/lib/hive/lib/snappy-java-1.0.5.jar:/usr/lib/hive/lib/antlr-runtime-3.4.jar:/usr/lib/hive/lib/hive-contrib.jar:/usr/lib/hive/lib/ST4-4.0.4.jar:/usr/lib/hive/lib/asm-commons-3.1.jar:/usr/lib/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/lib/hive/lib/jpam-1.1.jar:/usr/lib/hive/lib/eigenbase-xom-1.3.4.jar:/usr/lib/hive/lib/velocity-1.5.jar:/usr/lib/hive/lib/hive-shims-common-secure-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/zookeeper-3.4.5.2.1.7.0-784.jar:/usr/lib/hive/lib/hive-serde-0.13.0.2.1.7.0-784.jar:/usr/lib/hive/lib/jsr305-1.3.9.jar:/usr/lib/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/lib/hive/lib/commons-io-2.4.jar:/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.13.0.2.1.7.0-784.jar

hive conf: /etc/hive/conf
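
A quick way to confirm the HCatalog jar actually appears in that output is to split the colon-separated "hive dependency:" string and grep for it. A minimal shell sketch — the sample string below is a trimmed stand-in for the real output above, not the full classpath:

```shell
# Trimmed stand-in for the "hive dependency:" line printed by find-hive-dependency.sh
hive_dependency="/etc/hive/conf:/usr/lib/hive/lib/hive-exec.jar:/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.13.0.2.1.7.0-784.jar"

# Split on ':' into one path per line and look for the HCatalog core jar
if printf '%s' "$hive_dependency" | tr ':' '\n' | grep -q 'hive-hcatalog-core'; then
  echo "HCatalog jar present"
else
  echo "HCatalog jar missing"
fi
```

In this output the jar is the last entry, so the check prints "HCatalog jar present".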

On Fri, Jun 5, 2015 at 11:48 AM, Shi, Shaofeng <[email protected]> wrote:

> Not check-env.sh; try find-hive-dependency.sh; it will output something
> like "hive dependency: xxxx";
>
> On 6/5/15, 6:34 PM, "alex schufo" <[email protected]> wrote:
>
> >I checked that, but there isn't any dependency error; see the output below:
> >
> >$ bin/check-env.sh
> >
> >KYLIN_HOME is set to /home/username/kylin/kylin-0.7.1-incubating-SNAPSHOT
> >
> >On Fri, Jun 5, 2015 at 11:16 AM, Shi, Shaofeng <[email protected]> wrote:
> >
> >> Please run $KYLIN_HOME/bin/find-hive-dependency.sh and then copy the
> >> output here; Kylin needs to add HCatalog to the MR class path; if the
> >> HCatalog jar wasn't found, the "HCatInputFormat not found" error will
> >> be reported;
> >>
> >> On 6/5/15, 4:58 PM, "alex schufo" <[email protected]> wrote:
> >>
> >> >Hi all,
> >> >
> >> >I built a cube on a Hive fact table of about 300 million rows; all the
> >> >steps FINISHED with success, but it seems I cannot run any query.
> >> >
> >> >select * from myFactTable; (or any other query)
> >> >
> >> >returns "From line 1, column 15 to line 1, column 43: Table
> >> >'MYFACTTABLE' not found" while executing SQL: "select * from
> >> >myFactTable LIMIT 50000"
> >> >
> >> >I tried completely rebuilding the cube after a purge, created a new
> >> >project, etc., but it doesn't solve the issue.
> >> >
> >> >It seems like a new KYLIN_ table has been created and contains data.
> >> >How can I know which HBase table corresponds to which cube?
> >> >
> >> >The "Convert Cuboid Data to HFile" step took a while; the MapReduce
> >> >job stayed in Undefined status for a long time but finally Succeeded.
> >> >When I click on the link from the Kylin UI, it sends me to
> >> >7070/kylin/N/A instead of the MapReduce job info; is that a bad sign?
> >> >
> >> >In the logs I don't see any error in the Build N-Dimension Cuboid
> >> >Data, Calculate HTable Region Splits, or Create HTable steps.
> >> >
> >> >I am not sure if it's even related, but in the logs I can see this:
> >> >
> >> >2015-06-04 10:17:10,910 INFO  [pool-7-thread-10] mapreduce.Job:  map 1%
> >> >reduce 0%
> >> >
> >> >2015-06-04 10:17:11,936 INFO  [pool-7-thread-10] mapreduce.Job: Task
> >>Id :
> >> >attempt_1432057232815_49758_m_000003_2, Status : FAILED
> >> >
> >> >Error: java.lang.RuntimeException: java.lang.ClassNotFoundException:
> >>Class
> >> >org.apache.hive.hcatalog.mapreduce.HCatInputFormat not found
> >> >
> >> >        at
> >> >org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1961)
> >> >
> >> >        at
> >> >org.apache.hadoop.mapreduce.task.JobContextImpl.getInputFormatClass(JobContextImpl.java:174)
> >> >
> >> >        at
> >> >org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:726)
> >> >
> >> >        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
> >> >
> >> >        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> >> >
> >> >        at java.security.AccessController.doPrivileged(Native Method)
> >> >
> >> >        at javax.security.auth.Subject.doAs(Subject.java:415)
> >> >
> >> >        at
> >> >org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
> >> >
> >> >        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
> >> >
> >> >Caused by: java.lang.ClassNotFoundException: Class
> >> >org.apache.hive.hcatalog.mapreduce.HCatInputFormat not found
> >> >
> >> >        at
> >> >org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1867)
> >> >
> >> >        at
> >> >org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1959)
> >> >
> >> >        ... 8 more
> >> >
> >> >
> >> >
> >> >but then it seems to go on anyway:
> >> >
> >> >2015-06-04 10:17:15,127 INFO  [pool-7-thread-10] mapreduce.Job:  map 2%
> >> >reduce 0%
> >> >
> >> >2015-06-04 10:17:18,174 INFO  [pool-7-thread-10] mapreduce.Job:  map 3%
> >> >reduce 0%
> >> >
> >> >2015-06-04 10:17:20,202 INFO  [pool-7-thread-10] mapreduce.Job:  map 4%
> >> >reduce 0%
> >> >
> >> >2015-06-04 10:17:22,230 INFO  [pool-7-thread-10] mapreduce.Job:  map 5%
> >> >reduce 0%
> >> >
> >> >2015-06-04 10:17:23,244 INFO  [pool-7-thread-10] mapreduce.Job:  map 6%
> >> >reduce 0%
> >> >
> >> >...
> >> >
> >> >2015-06-04 10:19:33,458 INFO  [pool-7-thread-10] mapreduce.Job:  map
> >> >100% reduce 100%
> >> >
> >> >2015-06-04 10:19:33,481 INFO  [pool-7-thread-10] mapreduce.Job: Job
> >> >job_1432057232815_49758 completed successfully
> >> >
> >> >
> >> >This comes after
> >> >
> >> >
> >> >Hive Column Cardinality calculation for table
> >> >
> >> >and a step called:
> >> >
> >> >+----------------------------------------+
> >> >| null                                   |
> >> >+----------------------------------------+
> >> >
> >> >
> >> >Thanks for your help, I would really like to make this PoC work!
> >>
> >>
>
>
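
For reference, the usual remedy for the quoted ClassNotFoundException is to make the HCatalog core jar visible to the MapReduce tasks. A rough sketch, with heavy assumptions: the jar path is taken from the find-hive-dependency.sh output above, and exporting HADOOP_CLASSPATH before starting Kylin is only one possible hook — Kylin may instead ship the jar to tasks via tmpjars/-libjars, so treat this as illustrative, not as the fix Kylin itself uses:

```shell
# Sketch only: jar path copied from the find-hive-dependency.sh output above;
# whether HADOOP_CLASSPATH is the hook Kylin actually uses is an assumption.
hcat_jar="/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core-0.13.0.2.1.7.0-784.jar"

# Append to any existing HADOOP_CLASSPATH rather than clobbering it
export HADOOP_CLASSPATH="${HADOOP_CLASSPATH:+$HADOOP_CLASSPATH:}$hcat_jar"
echo "$HADOOP_CLASSPATH"
```

Note that HADOOP_CLASSPATH affects the client JVM; for the jar to reach the map tasks themselves it generally has to be distributed with the job (e.g. -libjars), which is what the "hive dependency" discovery is meant to feed.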
