Any suggestions on this?

Thanks,
robert



On Wednesday, June 18, 2014 11:56 PM, Grandl Robert <[email protected]> 
wrote:
 


I am using 2.4 for the client as well. I took the tar.gz from 
http://apache.petsads.us/hadoop/common/hadoop-2.4.0/ and am trying to run it. 
I even tried a single node. I am out of ideas as to why this happens. 

 



On Wednesday, June 18, 2014 11:52 PM, Jian He <[email protected]> wrote:



The method crossPlatformifyMREnv was newly added in the 2.4.0 release. Which 
version of the MR client are you using?
Can you make sure you have the same version of client jars on the classpath?
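One quick way to check for mixed client jars (a sketch; the search root is an assumption, point it at wherever the client's classpath entries actually live):

```shell
# List every hadoop-mapreduce-client jar the client machine can see and
# reduce the file names to their version strings. A single line of output
# (2.4.0) is what you want; multiple distinct versions mean mixed jars.
find /home/hadoop -name 'hadoop-mapreduce-client-*.jar' 2>/dev/null \
  | sed -E 's/.*-([0-9]+\.[0-9]+\.[0-9]+)[^/]*\.jar$/\1/' \
  | sort -u
```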

Jian



On Wed, Jun 18, 2014 at 10:37 PM, Grandl Robert <[email protected]> 
wrote:

Hi guys,
>
>I don't know what I did, but my Hadoop YARN setup went haywire. I am not able 
>to submit any job; it throws the following exception. 
>
>14/06/18 22:25:19 INFO mapreduce.JobSubmitter: number of splits:1
>14/06/18 22:25:19 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1403155404621_0001
>14/06/18 22:25:19 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/hadoop/.staging/job_1403155404621_0001
>java.lang.NoSuchMethodError: org.apache.hadoop.mapreduce.v2.util.MRApps.crossPlatformifyMREnv(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/yarn/api/ApplicationConstants$Environment;)Ljava/lang/String;
>    at org.apache.hadoop.mapred.YARNRunner.createApplicationSubmissionContext(YARNRunner.java:390)
>    at org.apache.hadoop.mapred.YARNRunner.submitJob(YARNRunner.java:284)
>    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:430)
>    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
>    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
>    at java.security.AccessController.doPrivileged(Native Method)
>    at javax.security.auth.Subject.doAs(Subject.java:415)
>    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
>    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
>    at org.apache.hadoop.examples.QuasiMonteCarlo.estimatePi(QuasiMonteCarlo.java:306)
>    at org.apache.hadoop.examples.QuasiMonteCarlo.run(QuasiMonteCarlo.java:354)
>    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>    at org.apache.hadoop.examples.QuasiMonteCarlo.main(QuasiMonteCarlo.java:363)
>    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>    at java.lang.reflect.Method.invoke(Method.java:601)
>    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
>
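For context: a NoSuchMethodError at run time means the MRApps class that actually got loaded lacks a method the job client was compiled against, i.e. an older copy of the class is winning on the classpath. One way to probe a specific jar for the 2.4.0-only method (a sketch; the jar location is an assumption based on the standard 2.4.0 tarball layout):

```shell
# Disassemble the MRApps class from one candidate jar and look for the
# method the stack trace complains about. The jar path below is an
# assumed location inside the 2.4.0 tarball.
JAR=/home/hadoop/hadoop-2.4.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.4.0.jar
javap -classpath "$JAR" org.apache.hadoop.mapreduce.v2.util.MRApps \
  | grep crossPlatformifyMREnv \
  || echo "method not found -- this jar predates 2.4.0"
```

Running the same check against every hadoop-mapreduce jar reachable from the client would show which copy is stale.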
>I configured all the classpath and environment variables as follows:
>export HADOOP_COMMON_HOME=/home/hadoop/hadoop-2.4.0
>export HADOOP_HOME=$HADOOP_COMMON_HOME
>export HADOOP_HDFS_HOME=$HADOOP_COMMON_HOME
>export HADOOP_MAPRED_HOME=$HADOOP_COMMON_HOME
>export HADOOP_YARN_HOME=$HADOOP_COMMON_HOME
>export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_MAPRED_HOME/lib/native/
>export HADOOP_CONF_DIR=/home/hadoop/rgrandl/conf/
>export YARN_CONF_DIR=/home/hadoop/rgrandl/conf/
>export HADOOP_BIN_PATH=$HADOOP_MAPRED_HOME/bin/
>export HADOOP_SBIN=$HADOOP_MAPRED_HOME/sbin/
>export HADOOP_LOGS=$HADOOP_HOME/logs
>export HADOOP_LOG_DIR=$HADOOP_HOME/logs
>export YARN_LOG_DIR=$HADOOP_HOME/logs
>
>export JAVA_HOME=/home/hadoop/rgrandl/java/
>export HADOOP_USER_CLASSPATH_FIRST=1
>export YARN_HOME=/home/hadoop/hadoop-2.4.0
>export TEZ_CONF_DIR=/home/hadoop/rgrandl/conf
>export TEZ_JARS=/home/hadoop/rgrandl/tez/tez-0.4.0-incubating
>
>export HADOOP_PREFIX=$HADOOP_COMMON_HOME
>
>export HADOOP_CLASSPATH=$HADOOP_HOME:/home/hadoop/rgrandl/tez/tez-0.4.0-incubating/*:/home/hadoop/rgrandl/tez/tez-0.4.0-incubating/lib/*:/home/hadoop/rgrandl/hive:/home/hadoop/rgrandl/conf
>
>export PATH=$PATH:$HADOOP_BIN_PATH:$HADOOP_SBIN:$YARN_CONF_DIR:$HADOOP_YARN_HOME:$HADOOP_MAPRED_HOME:$HADOOP_HDFS_HOME:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME:$JAVA_HOME/bin/:/home/hadoop/rgrandl/hive/bin
>
>
>Everything seems to be correct, but I cannot understand this error. It is 
>something I have never encountered before.
>
>
>Do you have any hints on it?
>
>Thanks,
>robert
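Worth noting from the exports above: with HADOOP_USER_CLASSPATH_FIRST=1, the Tez and Hive entries in HADOOP_CLASSPATH are searched before the 2.4.0 jars, so an older hadoop-mapreduce jar bundled under either directory would shadow the 2.4.0 MRApps class and produce exactly this NoSuchMethodError. A sketch of the check, using the directories from the exports (whether they contain such jars is the open question):

```shell
# Look for hadoop jars hiding inside the directories that are prepended
# to the classpath; any hit here is a candidate for the stale class.
for dir in /home/hadoop/rgrandl/tez/tez-0.4.0-incubating \
           /home/hadoop/rgrandl/tez/tez-0.4.0-incubating/lib \
           /home/hadoop/rgrandl/hive; do
  find "$dir" -name 'hadoop-*.jar' 2>/dev/null
done
```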
