Please bear in mind that all I want is to test Hive with the Tez engine. I
already have Hive working OK with the Spark 1.3.1 engine (I compiled Spark
from source code), so hopefully I can get Tez working as the Hive execution
engine as well.
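
For reference, switching engines is the usual session-level Hive setting; a
minimal sketch, assuming tez-site.xml and the tez jars are already on the
client classpath (sample_table is just a placeholder name):

   -- from the hive CLI or beeline
   set hive.execution.engine=tez;      -- spark is what currently works here
   select count(1) from sample_table;  -- placeholder query, only to force a job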


thanks



Dr Mich Talebzadeh



LinkedIn
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 21 May 2016 at 00:39, Mich Talebzadeh <[email protected]> wrote:

> Is this the instruction?
>
> Making use of the Tez Binary Release tarball
>
>    - If the binary tarball's name does not include anything referring to a
>    hadoop version, then this implies that the tarball was compiled against the
>    hadoop version that the Tez release compiles against by default. For
>    example, for 0.7.0 and 0.8.0, the default hadoop version used is 2.6.0
>    (this can be found by looking for the hadoop.version property in the
>    top-level pom.xml in the source tarball for the release).
>    - The tarball structure is as follows:
>
>    apache-tez-{x.y.z}/
>                      /tez*.jar
>                      /lib/*.jar
>                      /conf/tez*.xml.template
>                      /share/tez.tar.gz
>
>    - Set up Tez by following INSTALL.txt and use
>    apache-tez-{x.y.z}/share/tez.tar.gz as the full tarball to be uploaded to
>    HDFS.
>    - Use the config templates under apache-tez-{x.y.z}/conf/ to create
>    the tez-site.xml as needed in an appropriate conf directory. If you end up
>    using apache-tez-{x.y.z}/conf/, then do an
>    export TEZ_CONF_DIR="apache-tez-{x.y.z}/conf/"
>    - Add "apache-tez-{x.y.z}/*:apache-tez-{x.y.z}/lib/*:${TEZ_CONF_DIR}"
>    to HADOOP_CLASSPATH so as to get the tez client jars onto the classpath
>    invoked when using the "bin/hadoop jar" command to run an example job
>    (see the sketch below).
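>
> A minimal sketch of those steps for the apache-tez-0.7.1-bin tarball discussed
> in this thread; the paths are the ones already shown in this thread, and the
> exact layout is an assumption on my part rather than something the FAQ
> spells out:
>
>    # upload the bundled full tarball to HDFS
>    hdfs dfs -mkdir -p /usr/lib/apache-tez-0.7.1-bin
>    hdfs dfs -put /usr/lib/apache-tez-0.7.1-bin/share/tez-0.7.1.tar.gz /usr/lib/apache-tez-0.7.1-bin/
>
>    # point the client at the conf directory holding tez-site.xml
>    export TEZ_CONF_DIR=/usr/lib/apache-tez-0.7.1-bin/conf
>
>    # put the Tez client jars and config on the hadoop classpath
>    export TEZ_HOME=/usr/lib/apache-tez-0.7.1-bin
>    export HADOOP_CLASSPATH="${TEZ_HOME}/*:${TEZ_HOME}/lib/*:${TEZ_CONF_DIR}"
>
>    # run an example job
>    hadoop jar ${TEZ_HOME}/tez-examples-0.7.1.jar orderedwordcount /tmp/input/test.txt /tmp/out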
>
>
>
>
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn
> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
>
> On 21 May 2016 at 00:37, Mich Talebzadeh <[email protected]>
> wrote:
>
>> Thanks both
>>
>> so this is the file that needs to go into HDFS, correct?
>>
>> hduser@rhes564: /usr/lib/apache-tez-0.7.1-bin/share> ltr tez-0.7.1.tar.gz
>> -rw-r--r-- 1 hduser hadoop 39694439 May  4 18:47 tez-0.7.1.tar.gz
>>
>>
>> In hdfs I have now
>>
>> hduser@rhes564: /usr/lib/apache-tez-0.7.1-bin/share> hdfs dfs -ls
>> /usr/lib/apache-tez-0.7.1-bin
>>
>> -rw-r--r--   2 hduser supergroup   39694439 2016-05-21 00:31
>> /usr/lib/apache-tez-0.7.1-bin/tez-0.7.1.tar.gz
>>
>>
>> Now I only installed tez under
>>
>> /usr/lib/apache-tez-0.7.1-bin
>>
>> My Hadoop is installed in
>>
>> echo $HADOOP_HOME
>> /home/hduser/hadoop-2.6.0
>>
>> and my tez-site.xml is in
>>
>> $HADOOP_HOME/etc/hadoop
>>
>> OK, and this is my tez-site.xml content:
>>
>> hduser@rhes564: /home/hduser/hadoop-2.6.0/etc/hadoop> cat tez-site.xml
>> <configuration>
>>   <property>
>>     <name>tez.version</name>
>>     <value>0.7.1</value>
>>   </property>
>>   <property>
>>     <name>tez.lib.uris</name>
>>
>> <value>/usr/lib/apache-tez-0.7.1-bin,/usr/lib/apache-tez-0.7.1-bin/lib</value>
>>   </property>
>> </configuration>
>>
>> Is the tez.lib.uris value shown above correct, please?
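>>
>> For comparison, a sketch of the single-tarball style that the Tez install
>> notes describe (as I read them), assuming the tez-0.7.1.tar.gz uploaded to
>> HDFS above is the one from the share/ directory; ${fs.defaultFS} is expanded
>> from core-site.xml, so the path resolves against the default filesystem:
>>
>> <configuration>
>>   <property>
>>     <name>tez.lib.uris</name>
>>     <value>${fs.defaultFS}/usr/lib/apache-tez-0.7.1-bin/tez-0.7.1.tar.gz</value>
>>   </property>
>> </configuration>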
>>
>>
>>
>> Dr Mich Talebzadeh
>>
>>
>>
>> LinkedIn
>> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>
>>
>>
>> http://talebzadehmich.wordpress.com
>>
>>
>>
>> On 21 May 2016 at 00:24, Bikas Saha <[email protected]> wrote:
>>
>>> >> tez.lib.uris assumes that paths are based on the default fs and
>>> therefore if your setup is using HDFS as default, the paths /usr/lib would
>>> be invalid
>>>
>>> Are you sure? The paths below look right to me, except that the contents
>>> of the directories are wrong.
>>>
>>> <name>tez.lib.uris</name>
>>>
>>> <value>/usr/lib/apache-tez-0.7.1-bin,/usr/lib/apache-tez-0.7.1-bin/lib</value>
>>>
>>> hduser@rhes564: /usr/lib/apache-tez-0.7.1-bin> hdfs dfs -ls
>>> /usr/lib/apache-tez-0.7.1-bin Found 2 items
>>> -rw-r--r--   2 hduser supergroup   53092828 2016-05-20 23:15
>>> /usr/lib/apache-tez-0.7.1-bin/apache-tez-0.7.1-bin.tar.gz
>>> drwxr-xr-x   - hduser supergroup          0 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib
>>>
>>>
>>> -----Original Message-----
>>> From: Hitesh Shah [mailto:[email protected]]
>>> Sent: Friday, May 20, 2016 4:18 PM
>>> To: [email protected]
>>> Subject: Re: My first TEZ job fails
>>>
>>> Can you try the instructions mentioned at
>>> https://cwiki.apache.org/confluence/display/TEZ/Tez+Release+FAQ ?
>>>
>>> tez.lib.uris assumes that paths are based on the default fs and
>>> therefore if your setup is using HDFS as default, the paths /usr/lib would
>>> be invalid.
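>>>
>>> A quick way to check how those relative paths resolve (a sketch; the
>>> hdfs://rhes564:9000 value below is only an illustration of whatever
>>> fs.defaultFS actually is on this cluster):
>>>
>>>    hdfs getconf -confKey fs.defaultFS
>>>    hdfs dfs -ls hdfs://rhes564:9000/usr/lib/apache-tez-0.7.1-bin
>>>    # a tez.lib.uris entry like /usr/lib/... is read against that default fs,
>>>    # not against the local /usr/lib on the client machine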
>>>
>>> — Hitesh
>>>
>>> > On May 20, 2016, at 3:39 PM, Mich Talebzadeh <
>>> [email protected]> wrote:
>>> >
>>> > Still failing with /apache-tez-0.7.1-bin I am afraid.
>>> >
>>> > OK this is my tez-site.xml
>>> >
>>> > hduser@rhes564: /home/hduser/hadoop-2.6.0/etc/hadoop> cat tez-site.xml
>>> > <configuration>
>>> >   <property>
>>> >     <name>tez.version</name>
>>> >     <value>0.7.1</value>
>>> >   </property>
>>> >   <property>
>>> >     <name>tez.lib.uris</name>
>>> >
>>> > <value>/usr/lib/apache-tez-0.7.1-bin,/usr/lib/apache-tez-0.7.1-bin/lib</value>
>>> >   </property>
>>> > </configuration>
>>> >
>>> > This is what I have put in hdfs directory
>>> >
>>> > hduser@rhes564: /usr/lib/apache-tez-0.7.1-bin> hdfs dfs -ls
>>> > /usr/lib/apache-tez-0.7.1-bin Found 2 items
>>> > -rw-r--r--   2 hduser supergroup   53092828 2016-05-20 23:15
>>> /usr/lib/apache-tez-0.7.1-bin/apache-tez-0.7.1-bin.tar.gz
>>> > drwxr-xr-x   - hduser supergroup          0 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib
>>> >
>>> > Also I put all of the local /usr/lib/apache-tez-0.7.1-bin/lib/*.jar files
>>> > into the HDFS directory /usr/lib/apache-tez-0.7.1-bin/lib
>>> >
>>> > hduser@rhes564: /usr/lib/apache-tez-0.7.1-bin> hdfs dfs -ls
>>> > /usr/lib/apache-tez-0.7.1-bin/lib Found 22 items
>>> > -rw-r--r--   2 hduser supergroup     124846 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/RoaringBitmap-0.4.9.jar
>>> > -rw-r--r--   2 hduser supergroup      41123 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/commons-cli-1.2.jar
>>> > -rw-r--r--   2 hduser supergroup      58160 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/commons-codec-1.4.jar
>>> > -rw-r--r--   2 hduser supergroup     588337 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/commons-collections-3.2.2.jar
>>> > -rw-r--r--   2 hduser supergroup     751238 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/commons-collections4-4.1.jar
>>> > -rw-r--r--   2 hduser supergroup     185140 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/commons-io-2.4.jar
>>> > -rw-r--r--   2 hduser supergroup     284220 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/commons-lang-2.6.jar
>>> > -rw-r--r--   2 hduser supergroup    1599627 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/commons-math3-3.1.1.jar
>>> > -rw-r--r--   2 hduser supergroup    1648200 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/guava-11.0.2.jar
>>> > -rw-r--r--   2 hduser supergroup     664918 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/hadoop-mapreduce-client-common-2.6.0.jar
>>> > -rw-r--r--   2 hduser supergroup    1509399 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/hadoop-mapreduce-client-core-2.6.0.jar
>>> > -rw-r--r--   2 hduser supergroup     130458 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/jersey-client-1.9.jar
>>> > -rw-r--r--   2 hduser supergroup     147952 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/jersey-json-1.9.jar
>>> > -rw-r--r--   2 hduser supergroup      81743 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/jettison-1.3.4.jar
>>> > -rw-r--r--   2 hduser supergroup     539912 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/jetty-6.1.26.jar
>>> > -rw-r--r--   2 hduser supergroup     177131 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/jetty-util-6.1.26.jar
>>> > -rw-r--r--   2 hduser supergroup      33031 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/jsr305-2.0.3.jar
>>> > -rw-r--r--   2 hduser supergroup     111908 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/metrics-core-3.1.0.jar
>>> > -rw-r--r--   2 hduser supergroup     533455 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/protobuf-java-2.5.0.jar
>>> > -rw-r--r--   2 hduser supergroup     105112 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/servlet-api-2.5.jar
>>> > -rw-r--r--   2 hduser supergroup      26084 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/slf4j-api-1.7.5.jar
>>> > -rw-r--r--   2 hduser supergroup       8869 2016-05-20 23:27
>>> /usr/lib/apache-tez-0.7.1-bin/lib/slf4j-log4j12-1.7.5.jar
>>> >
>>> > Still getting this error
>>> >
>>> > hduser@rhes564: /usr/lib/apache-tez-0.7.1-bin> hadoop jar
>>> > ./tez-examples-0.7.1.jar orderedwordcount /tmp/input/test.txt /tmp/out
>>> >
>>> > Application application_1463775706014_0008 failed 2 times due to Error
>>> > launching appattempt_1463775706014_0008_000002. Got exception:
>>> > org.apache.hadoop.ipc.RemoteException(java.lang.NoSuchMethodError):
>>> > org.apache.hadoop.yarn.proto.YarnProtos$ApplicationIdProto.hashLong(J)I
>>> >   at org.apache.hadoop.yarn.proto.YarnProtos$ApplicationIdProto.hashCode(YarnProtos.java:2616)
>>> >   at org.apache.hadoop.yarn.proto.YarnProtos$ApplicationAttemptIdProto.hashCode(YarnProtos.java:3154)
>>> >   at org.apache.hadoop.yarn.proto.YarnSecurityTokenProtos$NMTokenIdentifierProto.hashCode(YarnSecurityTokenProtos.java:410)
>>> >   at org.apache.hadoop.yarn.security.NMTokenIdentifier.hashCode(NMTokenIdentifier.java:126)
>>> >   at java.util.HashMap.hash(HashMap.java:338)
>>> >
>>> > My protoc is 2.6.1, but everything else (Hadoop, Spark, Hive, etc.) all
>>> works fine!
>>> >
>>> > hduser@rhes564: /usr/lib/apache-tez-0.7.1-bin> protoc --version
>>> > libprotoc 2.6.1
>>> >
>>> > Thanks
>>> >
>>> > Dr Mich Talebzadeh
>>> >
>>> > LinkedIn
>>> > https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>> >
>>> > http://talebzadehmich.wordpress.com
>>> >
>>> >
>>> > On 20 May 2016 at 23:04, Mich Talebzadeh <[email protected]>
>>> wrote:
>>> > Thanks Bikas,
>>> >
>>> > Downloaded it and installed under /usr/lib/apache-tez-0.7.1-bin
>>> >
>>> > Are these env variables correct?
>>> >
>>> > export TEZ_HOME=/usr/lib/apache-tez-0.7.1-bin
>>> > export TEZ_JARS=/usr/lib/apache-tez-0.7.1-bin
>>> >
>>> >
>>> > Dr Mich Talebzadeh
>>> >
>>> > LinkedIn
>>> > https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>> >
>>> > http://talebzadehmich.wordpress.com
>>> >
>>> >
>>> > On 20 May 2016 at 22:43, Bikas Saha <[email protected]> wrote:
>>> > Yes. Almost all Hadoop ecosystem components are on 2.5.1 for protoc.
>>> >
>>> >
>>> >
>>> > Btw, unless you need something specific from the 0.8 branch and are on the
>>> right Hadoop version (2.6.0 or higher), you could use the recent 0.7.1 binary
>>> release so you won't have to build the code yourself. That may save you from
>>> these mismatch issues for the most part.
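>>> >
>>> > As a sketch of that route (the archive URL is my assumption of where the
>>> > 0.7.1 binary release lives and is worth checking against the Tez downloads
>>> > page):
>>> >
>>> >    wget https://archive.apache.org/dist/tez/0.7.1/apache-tez-0.7.1-bin.tar.gz
>>> >    tar -xzf apache-tez-0.7.1-bin.tar.gz -C /usr/lib/
>>> >    ls /usr/lib/apache-tez-0.7.1-bin/share/   # the tez-0.7.1.tar.gz to upload to HDFS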
>>> >
>>> >
>>> >
>>> > Bikas
>>> >
>>> >
>>> >
>>> > From: Mich Talebzadeh [mailto:[email protected]]
>>> > Sent: Friday, May 20, 2016 2:25 PM
>>> > To: [email protected]
>>> > Subject: Re: My first TEZ job fails
>>> >
>>> >
>>> >
>>> > the issue is that I have protobuf-2.6.1 and I believe I modified the
>>> pom.xml under apache-tez-0.8.3-src and changed it from 2.5.0 to 2.6.1.
>>> >
>>> >
>>> >
>>> > Now I have to install protobuf-2.5.0, and it is not working.
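>>> >
>>> > A sketch of one way out: build protobuf 2.5.0 into its own prefix so it
>>> > does not clash with the system 2.6.1, and revert the protobuf version in
>>> > the tez source pom.xml back to 2.5.0 before rebuilding (the download URL
>>> > is an assumption on my part; any protobuf 2.5.0 source tarball will do):
>>> >
>>> >    wget https://github.com/google/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.gz
>>> >    tar -xzf protobuf-2.5.0.tar.gz && cd protobuf-2.5.0
>>> >    ./configure --prefix=$HOME/protobuf-2.5.0
>>> >    make && make install
>>> >    export PATH=$HOME/protobuf-2.5.0/bin:$PATH
>>> >    protoc --version    # should now report libprotoc 2.5.0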
>>> >
>>> >
>>> >
>>> > Dr Mich Talebzadeh
>>> >
>>> > LinkedIn
>>> > https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>> >
>>> > http://talebzadehmich.wordpress.com
>>> >
>>> >
>>> >
>>> > On 20 May 2016 at 21:46, Mich Talebzadeh <[email protected]>
>>> wrote:
>>> >
>>> > Sounds like this may be the issue:
>>> >
>>> >
>>> >
>>> > java.lang.NoSuchMethodError:
>>> > org.apache.hadoop.yarn.proto.YarnProtos$ApplicationIdProto.hashLong(J)I
>>> >
>>> >
>>> >
>>> > Dr Mich Talebzadeh
>>> >
>>> > LinkedIn
>>> > https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>> >
>>> > http://talebzadehmich.wordpress.com
>>> >
>>> >
>>> >
>>> > On 20 May 2016 at 19:28, Mich Talebzadeh <[email protected]>
>>> wrote:
>>> >
>>> > Hi Hitesh,
>>> >
>>> >
>>> >
>>> > This is the content of tez-site.xml:
>>> >
>>> >
>>> >
>>> > hduser@rhes564: /home/hduser/hadoop-2.6.0/etc/hadoop> cat tez-site.xml
>>> > <configuration>
>>> >   <property>
>>> >     <name>tez.version</name>
>>> >     <value>0.8.3</value>
>>> >   </property>
>>> >
>>> >   <property>
>>> >     <name>tez.lib.uris</name>
>>> >     <value>/usr/lib/tez-0.8.3,/usr/lib/tez-0.8.3/lib/</value>
>>> >   </property>
>>> > </configuration>
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> > This is what is in hdfs
>>> >
>>> >
>>> >
>>> > drwxr-xr-x   - hduser supergroup          0 2016-05-20 19:25
>>> /usr/lib/tez-0.8.3/lib
>>> > -rw-r--r--   2 hduser supergroup   42821282 2016-05-20 09:03
>>> /usr/lib/tez-0.8.3/tez-0.8.3.tar.gz
>>> >
>>> >
>>> >
>>> >
>>> >
>>> > Tez is installed in
>>> >
>>> >
>>> >
>>> > cd $TEZ_HOME
>>> > hduser@rhes564: /usr/lib/tez-0.8.3>
>>> >
>>> >
>>> >
>>> > tez-site.xml is soft linked in $TEZ_HOME as follows
>>> >
>>> >
>>> >
>>> > hduser@rhes564: /usr/lib/tez-0.8.3> ls -l tez-site.xml
>>> > lrwxrwxrwx 1 hduser hadoop 49 May 20 19:22 tez-site.xml -> /home/hduser/hadoop-2.6.0/etc/hadoop/tez-site.xml
>>> >
>>> >
>>> >
>>> > This is the error:
>>> >
>>> >
>>> >
>>> > Application application_1463758195355_0004 failed 2 times due to Error
>>> > launching appattempt_1463758195355_0004_000002. Got exception:
>>> > org.apache.hadoop.ipc.RemoteException(java.lang.NoSuchMethodError):
>>> > org.apache.hadoop.yarn.proto.YarnProtos$ApplicationIdProto.hashLong(J)I
>>> >   at org.apache.hadoop.yarn.proto.YarnProtos$ApplicationIdProto.hashCode(YarnProtos.java:2616)
>>> >   at org.apache.hadoop.yarn.proto.YarnProtos$ApplicationAttemptIdProto.hashCode(YarnProtos.java:3154)
>>> >   at org.apache.hadoop.yarn.proto.YarnSecurityTokenProtos$NMTokenIdentifierProto.hashCode(YarnSecurityTokenProtos.java:410)
>>> >   at org.apache.hadoop.yarn.security.NMTokenIdentifier.hashCode(NMTokenIdentifier.java:126)
>>> >   at java.util.HashMap.hash(HashMap.java:338)
>>> >   at java.util.HashMap.put(HashMap.java:611)
>>> >   at java.util.HashSet.add(HashSet.java:219)
>>> >   at javax.security.auth.Subject$ClassSet.populateSet(Subject.java:1409)
>>> >   at javax.security.auth.Subject$ClassSet.<init>(Subject.java:1369)
>>> >   at javax.security.auth.Subject.getPublicCredentials(Subject.java:720)
>>> >   at org.apache.hadoop.security.UserGroupInformation.getTokenIdentifiers(UserGroupInformation.java:1400)
>>> >   at org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl.selectNMTokenIdentifier(ContainerManagerImpl.java:618)
>>> >   at org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl.startContainers(ContainerManagerImpl.java:699)
>>> >   at org.apache.hadoop.yarn.api.impl.pb.service.ContainerManagementProtocolPBServiceImpl.startContainers(ContainerManagementProtocolPBServiceImpl.java:60)
>>> >   at org.apache.hadoop.yarn.proto.ContainerManagementProtocol$ContainerManagementProtocolService$2.callBlockingMethod(ContainerManagementProtocol.java:95)
>>> >   at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
>>> >   at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
>>> >   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
>>> >   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
>>> >   at java.security.AccessController.doPrivileged(Native Method)
>>> >   at javax.security.auth.Subject.doAs(Subject.java:422)
>>> >   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>>> >   at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
>>> >   at org.apache.hadoop.ipc.Client.call(Client.java:1468)
>>> >   at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>> >   at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>>> >   at com.sun.proxy.$Proxy80.startContainers(Unknown Source)
>>> >   at org.apache.hadoop.yarn.api.impl.pb.client.ContainerManagementProtocolPBClientImpl.startContainers(ContainerManagementProtocolPBClientImpl.java:96)
>>> >   at org.apache.hadoop.yarn.server.resourcemanager.amlauncher.AMLauncher.launch(AMLauncher.java:119)
>>> >   at org.apache.hadoop.yarn.server.resourcemanager.amlauncher.AMLauncher.run(AMLauncher.java:254)
>>> >   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>> >   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>> >   at java.lang.Thread.run(Thread.java:745)
>>> > . Failing the application.
>>> >
>>> > Thanks
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> > Dr Mich Talebzadeh
>>> >
>>> > LinkedIn
>>> > https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>> >
>>> > http://talebzadehmich.wordpress.com
>>> >
>>> >
>>> >
>>> > On 20 May 2016 at 19:06, Hitesh Shah <[email protected]> wrote:
>>> >
>>> > Logs from `bin/yarn logs -applicationId
>>> application_1463758195355_0002` would be more useful to debug your setup
>>> issue. The RM logs usually do not shed much light on why an application
>>> failed.
>>> > Can you confirm that you configured tez.lib.uris correctly to point to
>>> the tez tarball on HDFS (tez tar should be the one obtained from
>>> tez-dist/target/tez-0.8.3.tar.gz) ?
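>>> >
>>> > For example (a sketch, using the application id from the failed run above):
>>> >
>>> >    yarn logs -applicationId application_1463758195355_0002 > app_0002.log
>>> >    hdfs dfs -ls /usr/lib/tez-0.8.3/tez-0.8.3.tar.gz   # the tarball tez.lib.uris should point at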
>>> >
>>> > — Hitesh
>>> >
>>> >
>>> > > On May 20, 2016, at 10:24 AM, Mich Talebzadeh <
>>> [email protected]> wrote:
>>> > >
>>> > > Hi,
>>> > >
>>> > > I have just compiled and installed TEZ, trying to do a test with
>>> > >
>>> > > hadoop jar ./tez-examples-0.8.3.jar orderedwordcount
>>> > > /tmp/input/test.txt /tmp/out
>>> > >
>>> > > The job fails as follows. This is from yarn log
>>> > >
>>> > > 2016-05-20 18:19:26,945 INFO
>>> > > SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
>>> > > appattempt_1463758195355_0002_000001 (auth:SIMPLE)
>>> > > 2016-05-20 18:19:26,950 WARN org.apache.hadoop.ipc.Server: IPC Server handler 0 on 59093, call
>>> > > org.apache.hadoop.yarn.api.ContainerManagementProtocolPB.startContainers from 50.140.197.217:46784 Call#2 Retry#0
>>> > > java.lang.NoSuchMethodError:
>>> org.apache.hadoop.yarn.proto.YarnProtos$ApplicationIdProto.hashLong(J)I
>>> > >         at
>>> org.apache.hadoop.yarn.proto.YarnProtos$ApplicationIdProto.hashCode(YarnProtos.java:2616)
>>> > >         at
>>> org.apache.hadoop.yarn.proto.YarnProtos$ApplicationAttemptIdProto.hashCode(YarnProtos.java:3154)
>>> > >         at
>>> org.apache.hadoop.yarn.proto.YarnSecurityTokenProtos$NMTokenIdentifierProto.hashCode(YarnSecurityTokenProtos.java:410)
>>> > >         at
>>> org.apache.hadoop.yarn.security.NMTokenIdentifier.hashCode(NMTokenIdentifier.java:126)
>>> > >         at java.util.HashMap.hash(HashMap.java:338)
>>> > >         at java.util.HashMap.put(HashMap.java:611)
>>> > >         at java.util.HashSet.add(HashSet.java:219)
>>> > >         at
>>> javax.security.auth.Subject$ClassSet.populateSet(Subject.java:1409)
>>> > >         at
>>> javax.security.auth.Subject$ClassSet.<init>(Subject.java:1369)
>>> > >         at
>>> javax.security.auth.Subject.getPublicCredentials(Subject.java:720)
>>> > >         at
>>> org.apache.hadoop.security.UserGroupInformation.getTokenIdentifiers(UserGroupInformation.java:1400)
>>> > >         at
>>> org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl.selectNMTokenIdentifier(ContainerManagerImpl.java:618)
>>> > >         at
>>> org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl.startContainers(ContainerManagerImpl.java:699)
>>> > >         at
>>> org.apache.hadoop.yarn.api.impl.pb.service.ContainerManagementProtocolPBServiceImpl.startContainers(ContainerManagementProtocolPBServiceImpl.java:60)
>>> > >         at
>>> org.apache.hadoop.yarn.proto.ContainerManagementProtocol$ContainerManagementProtocolService$2.callBlockingMethod(ContainerManagementProtocol.java:95)
>>> > >         at
>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
>>> > >         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
>>> > >         at
>>> org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
>>> > >         at
>>> org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
>>> > >         at java.security.AccessController.doPrivileged(Native Method)
>>> > >         at javax.security.auth.Subject.doAs(Subject.java:422)
>>> > >         at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>>> > >         at
>>> > > org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
>>> > > 2016-05-20 18:19:27,929 WARN org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl:
>>> > > Event EventType: KILL_CONTAINER sent to absent container container_1463758195355_0002_01_000001
>>> > > 2016-05-20 18:19:27,944 INFO
>>> > > SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
>>> > > appattempt_1463758195355_0002_000002 (auth:SIMPLE)
>>> > > 2016-05-20 18:19:27,949 WARN org.apache.hadoop.ipc.Server: IPC Server handler 0 on 59093, call
>>> > > org.apache.hadoop.yarn.api.ContainerManagementProtocolPB.startContainers from 50.140.197.217:46785 Call#3 Retry#0
>>> > > java.lang.NoSuchMethodError:
>>> org.apache.hadoop.yarn.proto.YarnProtos$ApplicationIdProto.hashLong(J)I
>>> > >         at
>>> org.apache.hadoop.yarn.proto.YarnProtos$ApplicationIdProto.hashCode(YarnProtos.java:2616)
>>> > >         at
>>> org.apache.hadoop.yarn.proto.YarnProtos$ApplicationAttemptIdProto.hashCode(YarnProtos.java:3154)
>>> > >         at
>>> org.apache.hadoop.yarn.proto.YarnSecurityTokenProtos$NMTokenIdentifierProto.hashCode(YarnSecurityTokenProtos.java:410)
>>> > >         at
>>> org.apache.hadoop.yarn.security.NMTokenIdentifier.hashCode(NMTokenIdentifier.java:126)
>>> > >         at java.util.HashMap.hash(HashMap.java:338)
>>> > >         at java.util.HashMap.put(HashMap.java:611)
>>> > >         at java.util.HashSet.add(HashSet.java:219)
>>> > >         at
>>> javax.security.auth.Subject$ClassSet.populateSet(Subject.java:1409)
>>> > >         at
>>> javax.security.auth.Subject$ClassSet.<init>(Subject.java:1369)
>>> > >         at
>>> javax.security.auth.Subject.getPublicCredentials(Subject.java:720)
>>> > >         at
>>> org.apache.hadoop.security.UserGroupInformation.getTokenIdentifiers(UserGroupInformation.java:1400)
>>> > >         at
>>> org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl.selectNMTokenIdentifier(ContainerManagerImpl.java:618)
>>> > >         at
>>> org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl.startContainers(ContainerManagerImpl.java:699)
>>> > >         at
>>> org.apache.hadoop.yarn.api.impl.pb.service.ContainerManagementProtocolPBServiceImpl.startContainers(ContainerManagementProtocolPBServiceImpl.java:60)
>>> > >         at
>>> org.apache.hadoop.yarn.proto.ContainerManagementProtocol$ContainerManagementProtocolService$2.callBlockingMethod(ContainerManagementProtocol.java:95)
>>> > >         at
>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
>>> > >         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
>>> > >         at
>>> org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
>>> > >         at
>>> org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
>>> > >         at java.security.AccessController.doPrivileged(Native Method)
>>> > >         at javax.security.auth.Subject.doAs(Subject.java:422)
>>> > >         at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>>> > >         at
>>> > > org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
>>> > > 2016-05-20 18:19:28,931 WARN org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl:
>>> > > Event EventType: KILL_CONTAINER sent to absent container container_1463758195355_0002_02_000001
>>> > > 2016-05-20 18:19:28,932 WARN org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl:
>>> > > Event EventType: FINISH_APPLICATION sent to absent application application_1463758195355_0002
>>> > >
>>> > > Any ideas will be appreciated!
>>> > >
>>> > > Thanks
>>> > >
>>> > > Dr Mich Talebzadeh
>>> > >
>>> > > LinkedIn
>>> > > https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>> > >
>>> > > http://talebzadehmich.wordpress.com
>>> > >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >
>>>
>>>
>>>
>>>
>>
>
