I think a mix of hadoop-common classes is getting onto the classpath, possibly due to hive-exec.jar.
One option for you to try is to put the tez-minimal tarball on HDFS instead of the full tarball and set "tez.use.cluster.hadoop-libs" to true in your tez-site.xml. This will use the hadoop jars from the cluster and hopefully solve the mixed-version problem.

thanks
— Hitesh

On Feb 12, 2016, at 1:01 AM, no jihun <jees...@gmail.com> wrote:

> * I did nothing with hive-exec.jar or hive.jar.directory.
>
> * hive-default.xml
>
> $ cat eco/hive/conf/hive-default.xml
>
> <property>
>   <name>hive.jar.directory</name>
>   <value/>
>   <description>
>     This is the location hive in tez mode will look for to find a site wide
>     installed hive instance.
>   </description>
> </property>
>
> * tez/lib has hadoop-common-2.6.0.jar
> * the tez tarball on HDFS also has exactly the same jar list
>
> $ pwd
> /home1/apps/tez-0.8.2/lib
> $ ll
> -rw-rw-r-- 1 124846 2016-02-11 13:37 RoaringBitmap-0.4.9.jar
> -rw-rw-r-- 1 62983 2016-02-11 13:37 activation-1.1.jar
> -rw-rw-r-- 1 4467 2016-02-11 13:37 aopalliance-1.0.jar
> -rw-rw-r-- 1 44925 2016-02-11 13:37 apacheds-i18n-2.0.0-M15.jar
> -rw-rw-r-- 1 691479 2016-02-11 13:37 apacheds-kerberos-codec-2.0.0-M15.jar
> -rw-rw-r-- 1 16560 2016-02-11 13:37 api-asn1-api-1.0.0-M20.jar
> -rw-rw-r-- 1 79912 2016-02-11 13:37 api-util-1.0.0-M20.jar
> -rw-rw-r-- 1 535731 2016-02-11 13:37 async-http-client-1.8.16.jar
> -rw-rw-r-- 1 303139 2016-02-11 13:37 avro-1.7.4.jar
> -rw-rw-r-- 1 188671 2016-02-11 13:37 commons-beanutils-1.7.0.jar
> -rw-rw-r-- 1 206035 2016-02-11 13:37 commons-beanutils-core-1.8.0.jar
> -rw-rw-r-- 1 41123 2016-02-11 13:37 commons-cli-1.2.jar
> -rw-rw-r-- 1 58160 2016-02-11 13:37 commons-codec-1.4.jar
> -rw-rw-r-- 1 588337 2016-02-11 13:37 commons-collections-3.2.2.jar
> -rw-rw-r-- 1 751238 2016-02-11 13:37 commons-collections4-4.1.jar
> -rw-rw-r-- 1 241367 2016-02-11 13:37 commons-compress-1.4.1.jar
> -rw-rw-r-- 1 298829 2016-02-11 13:37 commons-configuration-1.6.jar
> -rw-rw-r-- 1 143602 2016-02-11 13:37 commons-digester-1.8.jar
> -rw-rw-r-- 1 305001 2016-02-11 13:37 commons-httpclient-3.1.jar
> -rw-rw-r-- 1 185140 2016-02-11 13:37 commons-io-2.4.jar
> -rw-rw-r-- 1 284220 2016-02-11 13:37 commons-lang-2.6.jar
> -rw-rw-r-- 1 62050 2016-02-11 13:37 commons-logging-1.1.3.jar
> -rw-rw-r-- 1 1599627 2016-02-11 13:37 commons-math3-3.1.1.jar
> -rw-rw-r-- 1 273370 2016-02-11 13:37 commons-net-3.1.jar
> -rw-rw-r-- 1 68866 2016-02-11 13:37 curator-client-2.6.0.jar
> -rw-rw-r-- 1 185245 2016-02-11 13:37 curator-framework-2.6.0.jar
> -rw-rw-r-- 1 248171 2016-02-11 13:37 curator-recipes-2.6.0.jar
> -rw-rw-r-- 1 190432 2016-02-11 13:37 gson-2.2.4.jar
> -rw-rw-r-- 1 1648200 2016-02-11 13:37 guava-11.0.2.jar
> -rw-rw-r-- 1 710492 2016-02-11 13:37 guice-3.0.jar
> -rw-rw-r-- 1 65012 2016-02-11 13:37 guice-servlet-3.0.jar
> -rw-rw-r-- 1 17035 2016-02-11 13:37 hadoop-annotations-2.6.0.jar
> -rw-rw-r-- 1 67167 2016-02-11 13:37 hadoop-auth-2.6.0.jar
> -rw-rw-r-- 1 3360985 2016-02-11 13:37 hadoop-common-2.6.0.jar
> -rw-rw-r-- 1 7822670 2016-02-11 13:37 hadoop-hdfs-2.6.0.jar
> -rw-rw-r-- 1 664918 2016-02-11 13:37 hadoop-mapreduce-client-common-2.6.0.jar
> -rw-rw-r-- 1 1509399 2016-02-11 13:37 hadoop-mapreduce-client-core-2.6.0.jar
> -rw-rw-r-- 1 1870176 2016-02-11 13:37 hadoop-yarn-api-2.6.0.jar
> -rw-rw-r-- 1 127986 2016-02-11 13:37 hadoop-yarn-client-2.6.0.jar
> -rw-rw-r-- 1 1602059 2016-02-11 13:37 hadoop-yarn-common-2.6.0.jar
> -rw-rw-r-- 1 289619 2016-02-11 13:37 hadoop-yarn-server-common-2.6.0.jar
> -rw-rw-r-- 1 28501 2016-02-11 13:37 hadoop-yarn-server-web-proxy-2.6.0.jar
> -rw-rw-r-- 1 31212 2016-02-11 13:37 htrace-core-3.0.4.jar
> -rw-rw-r-- 1 433368 2016-02-11 13:37 httpclient-4.2.5.jar
> -rw-rw-r-- 1 227275 2016-02-11 13:37 httpcore-4.2.4.jar
> -rw-rw-r-- 1 232248 2016-02-11 13:37 jackson-core-asl-1.9.13.jar
> -rw-rw-r-- 1 18336 2016-02-11 13:37 jackson-jaxrs-1.9.13.jar
> -rw-rw-r-- 1 780664 2016-02-11 13:37 jackson-mapper-asl-1.9.13.jar
> -rw-rw-r-- 1 27084 2016-02-11 13:37 jackson-xc-1.9.13.jar
> -rw-rw-r-- 1 2497 2016-02-11 13:37 javax.inject-1.jar
> -rw-rw-r-- 1 105134 2016-02-11 13:37 jaxb-api-2.2.2.jar
> -rw-rw-r-- 1 890168 2016-02-11 13:37 jaxb-impl-2.2.3-1.jar
> -rw-rw-r-- 1 130458 2016-02-11 13:37 jersey-client-1.9.jar
> -rw-rw-r-- 1 458739 2016-02-11 13:37 jersey-core-1.9.jar
> -rw-rw-r-- 1 14786 2016-02-11 13:37 jersey-guice-1.9.jar
> -rw-rw-r-- 1 147952 2016-02-11 13:37 jersey-json-1.9.jar
> -rw-rw-r-- 1 81743 2016-02-11 13:37 jettison-1.3.4.jar
> -rw-rw-r-- 1 539912 2016-02-11 13:37 jetty-6.1.26.jar
> -rw-rw-r-- 1 177131 2016-02-11 13:37 jetty-util-6.1.26.jar
> -rw-rw-r-- 1 33031 2016-02-11 13:37 jsr305-3.0.0.jar
> -rw-rw-r-- 1 1045744 2016-02-11 13:37 leveldbjni-all-1.8.jar
> -rw-rw-r-- 1 489884 2016-02-11 13:37 log4j-1.2.17.jar
> -rw-rw-r-- 1 111908 2016-02-11 13:37 metrics-core-3.1.0.jar
> -rw-rw-r-- 1 1303237 2016-02-11 13:37 netty-3.9.2.Final.jar
> -rw-rw-r-- 1 29555 2016-02-11 13:37 paranamer-2.3.jar
> -rw-rw-r-- 1 533455 2016-02-11 13:37 protobuf-java-2.5.0.jar
> -rw-rw-r-- 1 134133 2016-02-11 13:37 servlet-api-2.5-20081211.jar
> -rw-rw-r-- 1 105112 2016-02-11 13:37 servlet-api-2.5.jar
> -rw-rw-r-- 1 32119 2016-02-11 13:37 slf4j-api-1.7.10.jar
> -rw-rw-r-- 1 8866 2016-02-11 13:37 slf4j-log4j12-1.7.10.jar
> -rw-rw-r-- 1 995968 2016-02-11 13:37 snappy-java-1.0.4.1.jar
> -rw-rw-r-- 1 23346 2016-02-11 13:37 stax-api-1.0-2.jar
> -rw-rw-r-- 1 26514 2016-02-11 13:37 stax-api-1.0.1.jar
> -rw-rw-r-- 1 1229125 2016-02-11 13:37 xercesImpl-2.9.1.jar
> -rw-rw-r-- 1 194354 2016-02-11 13:37 xml-apis-1.3.04.jar
> -rw-rw-r-- 1 15010 2016-02-11 13:37 xmlenc-0.52.jar
> -rw-rw-r-- 1 94672 2016-02-11 13:37 xz-1.0.jar
> -rw-rw-r-- 1 792964 2016-02-11 13:37 zookeeper-3.4.6.jar
>
> * I am not sure whether hive was compiled against hadoop-2.6.0 (someone else
>   set up hive and hadoop), but there was no problem with hive on m/r.
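A quick way to check listings like the ones in this thread for a mixed-version problem is to extract the hadoop-common version from each jar name and see whether more than one distinct version appears. A minimal sketch; the jar names below are made-up sample input, not taken from the listings above:

```shell
# Collect hadoop-common versions from a list of jar file names and warn
# if more than one distinct version appears (hypothetical sample input).
jars="hadoop-common-2.6.0.jar hive-exec-1.2.1.jar hadoop-common-2.4.0.jar"

# Extract the version part of every hadoop-common-<version>.jar name.
versions=$(printf '%s\n' $jars | sed -n 's/^hadoop-common-\(.*\)\.jar$/\1/p' | sort -u)
echo "hadoop-common versions found:"
echo "$versions"

# More than one distinct version means mixed jars on the classpath.
if [ "$(printf '%s\n' "$versions" | wc -l)" -gt 1 ]; then
    echo "WARNING: mixed hadoop-common versions"
fi
```

In practice the jar list would come from the directories that end up on the Tez AM classpath (the tez tarball's lib directory plus whatever Hive localizes), not from a hard-coded string.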
> this is hive/lib:
> -rw-rw-r-- 1 236660 2014-01-30 07:09 ST4-4.0.4.jar
> -rw-rw-r-- 1 4368200 2015-04-30 03:26 accumulo-core-1.6.0.jar
> -rw-rw-r-- 1 102069 2015-04-30 03:26 accumulo-fate-1.6.0.jar
> -rw-rw-r-- 1 57420 2015-04-30 03:26 accumulo-start-1.6.0.jar
> -rw-rw-r-- 1 117409 2015-04-30 03:26 accumulo-trace-1.6.0.jar
> -rw-rw-r-- 1 62983 2014-01-30 07:08 activation-1.1.jar
> -rw-rw-r-- 1 1997485 2014-01-30 07:07 ant-1.9.1.jar
> -rw-rw-r-- 1 18336 2014-01-30 07:07 ant-launcher-1.9.1.jar
> -rw-rw-r-- 1 445288 2014-01-30 07:09 antlr-2.7.7.jar
> -rw-rw-r-- 1 164368 2014-01-30 07:09 antlr-runtime-3.4.jar
> -rw-rw-r-- 1 30359 2015-04-30 03:08 apache-curator-2.6.0.pom
> -rw-rw-r-- 1 448794 2015-04-30 03:08 apache-log4j-extras-1.2.17.jar
> -rw-rw-r-- 1 32693 2014-01-30 11:30 asm-commons-3.1.jar
> -rw-rw-r-- 1 21879 2014-01-30 11:30 asm-tree-3.1.jar
> -rw-rw-r-- 1 400680 2014-01-30 07:08 avro-1.7.5.jar
> -rw-rw-r-- 1 110600 2014-01-30 07:09 bonecp-0.8.0.RELEASE.jar
> -rw-rw-r-- 1 258370 2015-04-30 03:25 calcite-avatica-1.2.0-incubating.jar
> -rw-rw-r-- 1 3519262 2015-04-30 03:25 calcite-core-1.2.0-incubating.jar
> -rw-rw-r-- 1 442406 2015-04-30 03:25 calcite-linq4j-1.2.0-incubating.jar
> -rw-rw-r-- 1 188671 2014-01-30 07:08 commons-beanutils-1.7.0.jar
> -rw-rw-r-- 1 206035 2014-01-30 07:08 commons-beanutils-core-1.8.0.jar
> -rw-rw-r-- 1 41123 2014-01-30 07:07 commons-cli-1.2.jar
> -rw-rw-r-- 1 58160 2014-01-30 07:07 commons-codec-1.4.jar
> -rw-rw-r-- 1 575389 2014-01-30 07:08 commons-collections-3.2.1.jar
> -rw-rw-r-- 1 30595 2015-04-30 03:25 commons-compiler-2.7.6.jar
> -rw-rw-r-- 1 241367 2014-01-30 07:08 commons-compress-1.4.1.jar
> -rw-rw-r-- 1 298829 2014-01-30 07:08 commons-configuration-1.6.jar
> -rw-rw-r-- 1 160519 2014-05-16 06:37 commons-dbcp-1.4.jar
> -rw-rw-r-- 1 143602 2014-01-30 07:08 commons-digester-1.8.jar
> -rw-rw-r-- 1 279781 2014-01-30 07:07 commons-httpclient-3.0.1.jar
> -rw-rw-r-- 1 185140 2014-01-30 07:08 commons-io-2.4.jar
> -rw-rw-r-- 1 284220 2014-01-30 11:31 commons-lang-2.6.jar
> -rw-rw-r-- 1 62050 2014-01-30 07:07 commons-logging-1.1.3.jar
> -rw-rw-r-- 1 832410 2014-01-30 07:08 commons-math-2.1.jar
> -rw-rw-r-- 1 96221 2014-05-01 09:38 commons-pool-1.5.4.jar
> -rw-rw-r-- 1 415578 2015-04-30 03:26 commons-vfs2-2.0.jar
> -rw-rw-r-- 1 68866 2015-04-30 03:08 curator-client-2.6.0.jar
> -rw-rw-r-- 1 185245 2015-04-30 03:08 curator-framework-2.6.0.jar
> -rw-rw-r-- 1 248171 2015-04-30 03:23 curator-recipes-2.6.0.jar
> -rw-rw-r-- 1 339666 2014-05-13 09:24 datanucleus-api-jdo-3.2.6.jar
> -rw-rw-r-- 1 1890075 2014-05-13 09:24 datanucleus-core-3.2.10.jar
> -rw-rw-r-- 1 1809447 2014-05-13 09:24 datanucleus-rdbms-3.2.9.jar
> -rw-rw-r-- 1 2838580 2015-06-19 18:04 derby-10.10.2.0.jar
> -rw-rw-r-- 1 18482 2015-04-30 03:25 eigenbase-properties-1.1.5.jar
> -rw-rw-r-- 1 12452 2014-01-30 11:30 geronimo-annotation_1.0_spec-1.1.1.jar
> -rw-rw-r-- 1 30548 2014-01-30 11:30 geronimo-jaspic_1.0_spec-1.0.jar
> -rw-rw-r-- 1 16030 2014-01-30 11:30 geronimo-jta_1.1_spec-1.1.1.jar
> -rw-rw-r-- 1 6377448 2014-01-30 07:10 groovy-all-2.1.6.jar
> -rw-rw-r-- 1 2189117 2015-04-30 03:08 guava-14.0.1.jar
> -rw-rw-r-- 1 76643 2014-01-30 07:07 hamcrest-core-1.1.jar
> -rw-rw-r-- 1 121403 2015-06-19 18:05 hive-accumulo-handler-1.2.1.jar
> -rw-rw-r-- 1 47713 2015-06-19 18:04 hive-ant-1.2.1.jar
> -rw-rw-r-- 1 138361 2015-06-19 18:05 hive-beeline-1.2.1.jar
> -rw-rw-r-- 1 39019 2015-06-19 18:05 hive-cli-1.2.1.jar
> -rw-rw-r-- 1 292290 2015-06-19 18:03 hive-common-1.2.1.jar
> -rw-rw-r-- 1 121668 2015-06-19 18:05 hive-contrib-1.2.1.jar
> -rw-rw-r-- 1 20599030 2015-06-19 18:04 hive-exec-1.2.1.jar
> -rw-rw-r-- 1 115935 2015-06-19 18:05 hive-hbase-handler-1.2.1.jar
> -rw-rw-r-- 1 28091 2015-06-19 18:06 hive-hwi-1.2.1.jar
> -rw-rw-r-- 1 17360142 2015-06-19 18:05 hive-jdbc-1.2.1-standalone.jar
> -rw-rw-r-- 1 100580 2015-06-19 18:05 hive-jdbc-1.2.1.jar
> -rw-rw-r-- 1 5505100 2015-06-19 18:04 hive-metastore-1.2.1.jar
> -rw-rw-r-- 1 916706 2015-06-19 18:03 hive-serde-1.2.1.jar
> -rw-rw-r-- 1 1878543 2015-06-19 18:04 hive-service-1.2.1.jar
> -rw-rw-r-- 1 32390 2015-06-19 18:03 hive-shims-0.20S-1.2.1.jar
> -rw-rw-r-- 1 60070 2015-06-19 18:03 hive-shims-0.23-1.2.1.jar
> -rw-rw-r-- 1 8949 2015-06-19 18:03 hive-shims-1.2.1.jar
> -rw-rw-r-- 1 108914 2015-06-19 18:03 hive-shims-common-1.2.1.jar
> -rw-rw-r-- 1 13065 2015-06-19 18:03 hive-shims-scheduler-1.2.1.jar
> -rw-rw-r-- 1 14530 2015-06-19 18:06 hive-testutils-1.2.1.jar
> -rw-rw-r-- 1 719304 2015-04-30 03:08 httpclient-4.4.jar
> -rw-rw-r-- 1 321639 2015-04-30 03:08 httpcore-4.4.jar
> -rw-rw-r-- 1 1282424 2015-04-30 03:25 ivy-2.4.0.jar
> -rw-rw-r-- 1 611863 2015-04-30 03:25 janino-2.7.6.jar
> -rw-rw-r-- 1 60527 2015-04-30 03:26 jcommander-1.32.jar
> -rw-rw-r-- 1 201124 2014-01-30 07:09 jdo-api-3.0.1.jar
> -rw-rw-r-- 1 1681148 2014-05-13 09:25 jetty-all-7.6.0.v20120127.jar
> -rw-rw-r-- 1 1683027 2014-01-30 11:30 jetty-all-server-7.6.0.v20120127.jar
> -rw-rw-r-- 1 213854 2015-04-30 03:08 jline-2.12.jar
> -rw-rw-r-- 1 588001 2015-04-30 03:23 joda-time-2.5.jar
> -rw-rw-r-- 1 12131 2014-05-13 09:25 jpam-1.1.jar
> -rw-rw-r-- 1 45944 2014-01-30 07:10 json-20090211.jar
> -rw-rw-r-- 1 33031 2015-04-30 03:23 jsr305-3.0.0.jar
> -rw-rw-r-- 1 15071 2014-01-30 07:09 jta-1.1.jar
> -rw-rw-r-- 1 245039 2015-04-30 03:08 junit-4.11.jar
> -rw-rw-r-- 1 313686 2015-04-30 03:23 libfb303-0.9.2.jar
> -rw-rw-r-- 1 227712 2015-04-30 03:08 libthrift-0.9.2.jar
> -rw-rw-r-- 1 481535 2014-01-30 07:06 log4j-1.2.16.jar
> -rw-rw-r-- 1 447676 2014-01-30 11:30 mail-1.4.1.jar
> -rw-rw-r-- 1 94421 2015-04-30 03:26 maven-scm-api-1.4.jar
> -rw-rw-r-- 1 40066 2015-04-30 03:26 maven-scm-provider-svn-commons-1.4.jar
> -rw-rw-r-- 1 69858 2015-04-30 03:26 maven-scm-provider-svnexe-1.4.jar
> -rw-rw-r-- 1 1208356 2015-04-30 03:08 netty-3.7.0.Final.jar
> -rw-rw-r-- 1 19827 2015-04-30 03:23 opencsv-2.3.jar
> -rw-rw-r-- 1 65261 2014-01-30 07:07 oro-2.0.8.jar
> -rw-rw-r-- 1 29555 2014-01-30 07:08 paranamer-2.3.jar
> -rw-rw-r-- 1 2796935 2015-04-30 03:23 parquet-hadoop-bundle-1.6.0.jar
> -rw-rw-r-- 1 48557 2015-04-30 03:25 pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar
> drwxrwxr-x 6 4096 2015-09-10 20:14 php
> -rw-rw-r-- 1 250546 2014-01-30 11:29 plexus-utils-1.5.6.jar
> drwxrwxr-x 10 4096 2015-09-10 20:14 py
> -rw-rw-r-- 1 25429 2015-04-30 03:26 regexp-1.3.jar
> -rw-rw-r-- 1 105112 2014-01-30 07:08 servlet-api-2.5.jar
> -rw-rw-r-- 1 1251514 2014-01-30 07:08 snappy-java-1.0.5.jar
> -rw-r--r-- 1 162976273 2015-09-10 20:16 spark-assembly-1.4.1-hadoop2.6.0.jar
> -rw-rw-r-- 1 26514 2014-01-30 07:08 stax-api-1.0.1.jar
> -rw-rw-r-- 1 148627 2014-01-30 07:09 stringtemplate-3.2.1.jar
> -rw-rw-r-- 1 93210 2015-04-30 03:26 super-csv-2.2.0.jar
> -rw-rw-r-- 1 55953 2014-01-30 11:30 tempus-fugit-1.1.jar
> -rw-rw-r-- 1 392124 2014-01-30 07:07 velocity-1.5.jar
> -rw-rw-r-- 1 94672 2014-01-30 07:08 xz-1.0.jar
> -rw-rw-r-- 1 792964 2015-04-30 03:08 zookeeper-3.4.6.jar
>
> * and hadoop-common-2.6.0.jar has the addDeprecations(DeprecationDelta[]) method.
> <adddeprecations.png>
>
> 2016-02-12 17:29 GMT+09:00 Hitesh Shah <hit...@apache.org>:
> It seems to me that the Tez AM classpath somehow has a hadoop-common jar that
> does not have the Configuration.addDeprecations() api that YARN needs.
>
> For the Tez AM, the classpath is fully constructed based on the tez tarball
> (from HDFS using the distributed cache) and additional jars that Hive adds
> (hive-exec.jar, etc.). It does not use HADOOP_CLASSPATH or anything else from
> the cluster nodes. HADOOP_CLASSPATH is only used on the client node where the
> hive shell runs. Can you confirm that hive was also compiled against
> hadoop-2.6.0, as that might be pulling in a different version of hadoop-common?
>
> thanks
> — Hitesh
>
> On Feb 12, 2016, at 12:16 AM, no jihun <jees...@gmail.com> wrote:
>
> > Thanks Hitesh Shah.
> > It claims:
> >
> > 2016-02-12 14:59:07,388 [ERROR] [main] |app.DAGAppMaster|: Error starting DAGAppMaster
> > java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.addDeprecations([Lorg/apache/hadoop/conf/Configuration$DeprecationDelta;)V
> >     at org.apache.hadoop.yarn.conf.YarnConfiguration.addDeprecatedKeys(YarnConfiguration.java:79)
> >     at org.apache.hadoop.yarn.conf.YarnConfiguration.<clinit>(YarnConfiguration.java:73)
> >     at org.apache.tez.dag.app.DAGAppMaster.main(DAGAppMaster.java:2271)
> >
> > I am not sure, but according to this thread
> > (http://grokbase.com/t/cloudera/cdh-user/12765svj61/libjars-and-hadoop-jar-command)
> > this is perhaps caused by a $HADOOP_CLASSPATH problem.
> >
> > But I wonder, should I copy "tez-dist/target/tez-0.8.2" to all cluster nodes
> > and then export the following?
> > export TEZ_JARS=/home1/apps/tez-0.8.2
> > export TEZ_CONF_DIR=$TEZ_JARS/conf
> > export HADOOP_CLASSPATH=$TEZ_CONF_DIR:$TEZ_JARS/*:$TEZ_JARS/lib/*:$HADOOP_CLASSPATH
> >
> > I did this only on the name nodes.
> >
> > 2016-02-12 16:47 GMT+09:00 Hitesh Shah <hit...@apache.org>:
> > Run the following command: "bin/yarn logs -applicationId application_1452243782005_0292".
> > This should give you the logs for container_1452243782005_0292_02_000001, which may
> > shed more light on why the Tez ApplicationMaster is failing to launch when
> > triggered via Hive.
> >
> > thanks
> > — Hitesh
> >
> > On Feb 11, 2016, at 10:48 PM, no jihun <jees...@gmail.com> wrote:
> >
> > > Hi all.
> > >
> > > When I execute a query on hive I get the error below (the same happens in the hive cli).
> > > No more detailed log was found.
> > >
> > > What should I check?
> > > Any advice will be appreciated.
> > >
> > > versions
> > > - tez-0.8.2
> > > - hadoop 2.6.0
> > >
> > > -----------------------------------
> > >
> > > hive > set hive.execution.engine=tez;
> > > hive > select count(*) from contents;
> > >
> > > WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please
> > > use org.apache.hadoop.log.metrics.EventCounter in all the
> > > log4j.properties files.
> > >
> > > Logging initialized using configuration in
> > > file:/home1/eco/hive/conf/hive-log4j.properties
> > > hive> set hive.execution.engine=tez;
> > > hive> select count(*) from agg_band_contents;
> > > Query ID = irteam_20160212145903_9300f3b2-3942-4423-8586-73d2eaff9e58
> > > Total jobs = 1
> > > Launching Job 1 out of 1
> > > Exception in thread "Thread-10" java.lang.RuntimeException:
> > > org.apache.tez.dag.api.SessionNotRunning: TezSession has already shutdown.
> > > Application application_1452243782005_0292 failed 2 times due to AM
> > > Container for appattempt_1452243782005_0292_000002 exited with exitCode: 1
> > > For more detailed output, check application tracking page:
> > > http://xstathn003:8088/proxy/application_1452243782005_0292/ Then,
> > > click on links to logs of each attempt.
> > > Diagnostics: Exception from container-launch.
> > > Container id: container_1452243782005_0292_02_000001
> > > Exit code: 1
> > > Stack trace: ExitCodeException exitCode=1:
> > >     at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
> > >     at org.apache.hadoop.util.Shell.run(Shell.java:455)
> > >     at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
> > >     at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
> > >     at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
> > >     at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
> > >     at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> > >     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > >     at java.lang.Thread.run(Thread.java:745)
> > >
> > > Container exited with a non-zero exit code 1
> > > Failing this attempt. Failing the application.
> > >     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:535)
> > >     at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:74)
> > > Caused by: org.apache.tez.dag.api.SessionNotRunning: TezSession has
> > > already shutdown. Application application_1452243782005_0292 failed 2
> > > times due to AM Container for appattempt_1452243782005_0292_000002 exited
> > > with exitCode: 1
> > > For more detailed output, check application tracking page:
> > > http://xstathn003:8088/proxy/application_1452243782005_0292/ Then,
> > > click on links to logs of each attempt.
> > > Diagnostics: Exception from container-launch.
> > > Container id: container_1452243782005_0292_02_000001
> > > Exit code: 1
> > > Stack trace: ExitCodeException exitCode=1:
> > >     at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
> > >     at org.apache.hadoop.util.Shell.run(Shell.java:455)
> > >     at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
> > >     at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
> > >     at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
> > >     at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
> > >     at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> > >     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > >     at java.lang.Thread.run(Thread.java:745)
> > >
> > > Container exited with a non-zero exit code 1
> > > Failing this attempt. Failing the application.
> > >     at org.apache.tez.client.TezClient.waitTillReady(TezClient.java:784)
> > >     at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:205)
> > >     at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:116)
> > >     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:532)
> > >     ... 1 more
> > > Interrupting... Be patient, this might take some time.
> > >
> > > -------------------
> > > the tez examples jar works well:
> > > $ hadoop jar ~/apps/tez-0.8.2/tez-examples-0.8.2.jar orderedwordcount /tmp/set_webapps.sh /tmp/set_webapps.result
> > > $ ok.
> > >
> > > -------------------
> > > my setup:
> > >
> > > upload tez to hdfs:
> > > $ hadoop fs -mkdir /apps
> > > $ hadoop fs -mkdir /apps/tez-0.8.2
> > > $ hadoop fs -put tez-dist/target/tez-0.8.2.tar.gz /apps/tez-0.8.2/
> > >
> > > env export:
> > > export TEZ_JARS=/home1/irteam/apps/tez-0.8.2
> > > export TEZ_CONF_DIR=$TEZ_JARS/conf
> > > export HADOOP_CLASSPATH=$TEZ_CONF_DIR:$TEZ_JARS/*:$TEZ_JARS/lib/*:$HADOOP_CLASSPATH
> > > export HADOOP_USER_CLASSPATH_FIRST=true
> > > $ source ~/.bashrc
> > >
> > > $ mkdir tez-dist/target/tez-0.8.2/conf
> > > $ vi tez-dist/target/tez-0.8.2/conf/tez-site.xml
> > > <?xml version="1.0" encoding="UTF-8"?>
> > > <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
> > > <configuration>
> > >   <property>
> > >     <name>tez.lib.uris</name>
> > >     <value>${fs.defaultFS}/apps/tez-0.8.2/tez-0.8.2.tar.gz</value>
> > >   </property>
> > > </configuration>
> >
> > --
> > ----------------------------------------------
> > Jihun No ( 노지훈 )
> > ----------------------------------------------
> > Twitter : @nozisim
> > Facebook : nozisim
> > Website : http://jeesim2.godohosting.com
> > ---------------------------------------------------------------------------------
> > Market Apps : android market products.
>
> --
> ----------------------------------------------
> Jihun No ( 노지훈 )
> ----------------------------------------------
> Twitter : @nozisim
> Facebook : nozisim
> Website : http://jeesim2.godohosting.com
> ---------------------------------------------------------------------------------
> Market Apps : android market products.
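The workaround suggested at the top of the thread (tez-minimal tarball plus cluster hadoop libs) would look roughly like the tez-site.xml below. This is a sketch, not a tested configuration: it assumes the minimal tarball produced by the tez build (tez-dist/target/tez-0.8.2-minimal.tar.gz) has been uploaded with something like `hadoop fs -put tez-dist/target/tez-0.8.2-minimal.tar.gz /apps/tez-0.8.2/`; the HDFS path is an illustrative example.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <!-- Point tez.lib.uris at the minimal tarball, which does not bundle
       hadoop jars. The path below is an assumed example location. -->
  <property>
    <name>tez.lib.uris</name>
    <value>${fs.defaultFS}/apps/tez-0.8.2/tez-0.8.2-minimal.tar.gz</value>
  </property>
  <!-- Tell Tez to pick up the hadoop jars installed on the cluster nodes
       instead of shipping its own, avoiding mixed hadoop-common versions. -->
  <property>
    <name>tez.use.cluster.hadoop-libs</name>
    <value>true</value>
  </property>
</configuration>
```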