Hi Hitesh,
 
Thanks for your time.
 
hdfs-site.xml and core-site.xml on the client are the same as on the cluster.
yarn.application.classpath is set in yarn-site.xml, and it includes these two 
files as well as the lib jars. Since non-Tez applications run fine, this 
configuration should be correct.
 
 <property>
    <name>yarn.application.classpath</name>
    <value>$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*</value>
 </property>
 
I also tried replacing these values with absolute paths, but Tez still doesn't work.
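
For comparison, the default classpath that YARN falls back to when this property 
isn't set (as defined in YarnConfiguration for Hadoop 2.4, assuming the standard 
tarball layout) also covers the HDFS and YARN jars:

 <property>
    <name>yarn.application.classpath</name>
    <!-- default entries; adjust the paths to your install layout -->
    <value>$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/share/hadoop/common/*,$HADOOP_COMMON_HOME/share/hadoop/common/lib/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,$HADOOP_YARN_HOME/share/hadoop/yarn/*,$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*</value>
 </property>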
 

 
> Subject: Re: Cannot submit Tez application
> From: hit...@apache.org
> Date: Sun, 11 May 2014 08:04:17 -0700
> To: dev@tez.incubator.apache.org
> 
> @Azury Yu, a question - is the hdfs-site and core-site that is configured on 
> the cluster different than the one in use on the client? Also, is 
> yarn.application.classpath set up such that the cluster’s hdfs-site and 
> core-site are not in that class path? 
> 
> At the moment, it seems like there is a difference which is getting fixed by 
> having the configs from the client be pushed to the cluster via local 
> resources.
> 
> thanks
> ― Hitesh
> 
> 
> 
> 
> On May 10, 2014, at 11:44 PM, AzuryYu <azur...@outlook.com> wrote:
> 
> > Hi,
> > I resolved this issue, but it looks strange.
> > 
> > I uploaded hdfs-site.xml and core-site.xml to the tez.lib.uris directory on HDFS.
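> > 
> > For reference, tez-site.xml on the client points tez.lib.uris at that HDFS 
> > directory; the path below is just the one I picked, substitute your own:
> > 
> >  <property>
> >     <name>tez.lib.uris</name>
> >     <!-- example location; hdfs-site.xml and core-site.xml were copied here too -->
> >     <value>hdfs://test-cluster/apps/tez-0.5</value>
> >  </property>
> > 
> > With the two config files sitting next to the Tez jars, they get localized to 
> > the AM along with everything else, which seems to be why it now works.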
> > 
> > 
> > 
> >> From: azur...@outlook.com
> >> To: dev@tez.incubator.apache.org
> >> Subject: RE: Cannot submit Tez application
> >> Date: Sun, 11 May 2014 04:41:07 +0000
> >> 
> >> After looking through the log, it was: 
> >> 2014-05-11 12:24:54,590 FATAL [main] org.apache.tez.dag.app.DAGAppMaster: 
> >> Error starting DAGAppMaster
> >> java.lang.IllegalArgumentException: java.net.UnknownHostException: 
> >> test-cluster
> >> at 
> >> org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:377)
> >> at 
> >> org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:240)
> >> at 
> >> org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:144)
> >> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:595)
> >> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:540)
> >> at 
> >> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:140)
> >> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2402)
> >> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
> >> at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2436)
> >> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2418)
> >> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
> >> at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
> >> at org.apache.tez.dag.app.DAGAppMaster.serviceInit(DAGAppMaster.java:387)
> >> 
> >> 
> >> I have enabled HDFS HA, and test-cluster is my nameservice ID, so how do I 
> >> solve this UnknownHostException? Thanks.
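> >> 
> >> For context, the client-side hdfs-site.xml defines the nameservice roughly 
> >> like this (host names below are placeholders):
> >> 
> >>  <property>
> >>     <name>dfs.nameservices</name>
> >>     <value>test-cluster</value>
> >>  </property>
> >>  <property>
> >>     <name>dfs.ha.namenodes.test-cluster</name>
> >>     <value>nn1,nn2</value>
> >>  </property>
> >>  <property>
> >>     <name>dfs.namenode.rpc-address.test-cluster.nn1</name>
> >>     <value>namenode1.example.com:8020</value>
> >>  </property>
> >>  <property>
> >>     <name>dfs.namenode.rpc-address.test-cluster.nn2</name>
> >>     <value>namenode2.example.com:8020</value>
> >>  </property>
> >>  <property>
> >>     <name>dfs.client.failover.proxy.provider.test-cluster</name>
> >>     <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
> >>  </property>
> >> 
> >> If the DAGAppMaster doesn't see these properties, it would treat test-cluster 
> >> as a plain host name, which matches the UnknownHostException above.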
> >> 
> >> 
> >> 
> >>> From: azur...@outlook.com
> >>> To: dev@tez.incubator.apache.org
> >>> Subject: Cannot submit Tez application
> >>> Date: Sun, 11 May 2014 03:26:11 +0000
> >>> 
> >>> Hi,
> >>> 
> >>> I built Tez 0.5 against hadoop-2.4.0, put all of the jars on HDFS, and 
> >>> configured tez-site.xml accordingly.
> >>> 
> >>> The Hive version is 0.13.0, and I've set hive.execution.engine to tez in 
> >>> hive-site.xml.
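> >>> 
> >>> The relevant hive-site.xml entry is simply:
> >>> 
> >>>  <property>
> >>>     <name>hive.execution.engine</name>
> >>>     <value>tez</value>
> >>>  </property>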
> >>> 
> >>> Then I submitted the Hive query and got the following exception:
> >>> 
> >>> 2014-05-11 11:13:08,533 ERROR [main]: exec.Task 
> >>> (TezTask.java:execute(185)) - Failed to execute tez graph.
> >>> java.lang.IllegalStateException: Vertex: Reducer 4 already has group 
> >>> input with name:Union 3
> >>>        at org.apache.tez.dag.api.Vertex.addGroupInput(Vertex.java:250)
> >>>        at org.apache.tez.dag.api.DAG.processEdgesAndGroups(DAG.java:223)
> >>>        at org.apache.tez.dag.api.DAG.verify(DAG.java:284)
> >>>        at org.apache.tez.dag.api.DAG.createDag(DAG.java:462)
> >>>        at org.apache.tez.client.TezSession.submitDAG(TezSession.java:216)
> >>>        at org.apache.tez.client.TezSession.submitDAG(TezSession.java:155)
> >>>        at 
> >>> org.apache.hadoop.hive.ql.exec.tez.TezTask.submit(TezTask.java:320)
> >>>        at 
> >>> org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:165)
> >>>        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
> >>>        at 
> >>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
> >>>        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
> >>>        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
> >>>        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
> >>>        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
> >>>        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
> >>>        at 
> >>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
> >>>        at 
> >>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
> >>>        at 
> >>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
> >>>        at 
> >>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
> >>>        at 
> >>> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:742)
> >>>        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
> >>>        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
> >>>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>        at 
> >>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >>>        at 
> >>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>>        at java.lang.reflect.Method.invoke(Method.java:606)
> >>>        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> >>> 2014-05-11 11:13:08,554 ERROR [main]: ql.Driver 
> >>> (SessionState.java:printError(545)) - FAILED: Execution Error, return 
> >>> code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
> >>>                                     
> >>                                      
> >                                       
> 
                                          
