[ https://issues.apache.org/jira/browse/OOZIE-2764?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15784803#comment-15784803 ]

Pierre Beauvois commented on OOZIE-2764:
----------------------------------------

I faced the same issue. After some digging, I found the cause: the Oozie build 
produced Hadoop 2.6.0 jars even though Hadoop 2.7.3 was specified with the 
option "-Dhadoop.version=2.7.3".
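
You can first check which Hadoop jars actually ended up in the generated war 
(a quick sanity check, assuming oozie.war is in the current directory):

{noformat}jar -tf oozie.war | grep 'WEB-INF/lib/hadoop-'{noformat}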

Please do the following (a consolidated sketch of the whole procedure follows 
the steps):

1. Unjar the oozie.war (do this step in a new directory):

{noformat}jar -xvf oozie.war{noformat}

2. Delete all Hadoop jars (I did the same for the Hive jars) in 
oozie.war/WEB-INF/lib. In my case, I compiled Oozie for Hadoop 2.7.3 but Hadoop 
2.6.0 jars were in the directory:

{noformat}rm -f oozie.war/WEB-INF/lib/hadoop-*.jar
rm -f oozie.war/WEB-INF/lib/hive-*.jar{noformat}

3. Create a new oozie.war (you must be in the extracted oozie.war directory):

{noformat}jar -cvf oozie.war ./*{noformat}

4. Regenerate the Oozie server webapp with a prepare-war:

{noformat}oozie-setup.sh prepare-war{noformat}

5. The error no longer appears.
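
For convenience, here is the whole repack as one sketch. It is not a drop-in 
script: the $OOZIE_HOME variable, the temporary extraction directory and the 
bin/oozie-setup.sh path are assumptions, so adapt them to your layout.

{noformat}
# Sketch only -- $OOZIE_HOME, the extraction directory and the bin/ path are assumptions.
cd $OOZIE_HOME
mkdir oozie-war-extract && cd oozie-war-extract
jar -xvf ../oozie.war                      # step 1: unpack the original war
rm -f WEB-INF/lib/hadoop-*.jar             # step 2: drop the mismatched Hadoop jars
rm -f WEB-INF/lib/hive-*.jar               #         (and the Hive jars)
jar -cvf ../oozie.war ./*                  # step 3: repack the cleaned war
cd .. && bin/oozie-setup.sh prepare-war    # step 4: rebuild the server webapp from libext
{noformat}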

Note: you'll have to copy several Hadoop and Hive jars into your libext 
directory (a copy sketch follows the listing). My libext directory looks like 
this:

{noformat}
activation-1.1.jar,apacheds-i18n-2.0.0-M15.jar,apacheds-kerberos-codec-2.0.0-M15.jar,api-asn1-api-1.0.0-M20.jar,api-util-1.0.0-M20.jar,avro-1.7.4.jar,commons-beanutils-core-1.8.0.jar,commons-cli-1.2.jar,commons-codec-1.4.jar,commons-collections-3.2.2.jar,commons-compress-1.4.1.jar,commons-configuration-1.6.jar,commons-digester-1.8.jar,commons-httpclient-3.1.jar,commons-io-2.4.jar,commons-lang-2.6.jar,commons-logging-1.1.3.jar,commons-math3-3.1.1.jar,commons-net-3.1.jar,curator-client-2.7.1.jar,curator-framework-2.7.1.jar,curator-recipes-2.7.1.jar,ext-2.2.zip,gson-2.2.4.jar,guava-11.0.2.jar,hadoop-annotations-2.7.3.jar,hadoop-auth-2.7.3.jar,hadoop-common-2.7.3.jar,hadoop-hdfs-2.7.3.jar,hadoop-mapreduce-client-app-2.7.3.jar,hadoop-mapreduce-client-common-2.7.3.jar,hadoop-mapreduce-client-core-2.7.3.jar,hadoop-mapreduce-client-hs-2.7.3.jar,hadoop-mapreduce-client-jobclient-2.7.3.jar,hadoop-mapreduce-client-shuffle-2.7.3.jar,hadoop-yarn-api-2.7.3.jar,hadoop-yarn-client-2.7.3.jar,hadoop-yarn-common-2.7.3.jar,hadoop-yarn-server-applicationhistoryservice-2.7.3.jar,hadoop-yarn-server-common-2.7.3.jar,hadoop-yarn-server-resourcemanager-2.7.3.jar,hadoop-yarn-server-web-proxy-2.7.3.jar,hive-ant-2.1.0.jar,hive-common-2.1.0.jar,hive-exec-2.1.0.jar,hive-hcatalog-core-2.1.0.jar,hive-jdbc-2.1.0.jar,hive-metastore-2.1.0.jar,hive-serde-2.1.0.jar,hive-service-2.1.0.jar,hive-shims-0.23-2.1.0.jar,hive-shims-0.23.jar,hive-shims-2.1.0.jar,hive-shims-common-2.1.0.jar,hive-webhcat-java-client-2.1.0.jar,htrace-core-3.1.0-incubating.jar,httpclient-4.2.5.jar,httpcore-4.2.5.jar,jackson-core-asl-1.9.13.jar,jackson-jaxrs-1.9.13.jar,jackson-mapper-asl-1.9.13.jar,jackson-xc-1.9.13.jar,jaxb-api-2.2.2.jar,jaxb-impl-2.2.3-1.jar,jersey-client-1.9.jar,jersey-core-1.9.jar,jetty-6.1.26.jar,jetty-util-6.1.26.jar,jsr305-3.0.0.jar,leveldbjni-all-1.8.jar,libfb303-0.9.3.jar,log4j-1.2.17.jar,mysql-connector-java-5.1.25-bin.jar,netty-3.6.2.Final.jar,paranamer-2.3.jar,protobuf-java-2.5.0.jar,servlet-api-2.5.jar,slf4j-api-1.7.10.jar,slf4j-log4j12-1.7.10.jar,snappy-java-1.0.4.1.jar,stax-api-1.0-2.jar,xercesImpl-2.9.1.jar,xml-apis-1.3.04.jar,xmlenc-0.52.jar,xz-1.0.jar,zookeeper-3.4.6.jar
{noformat}
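
The Hadoop jars above come from the Hadoop 2.7.3 distribution and the Hive 
jars from the Hive 2.1.0 distribution. A hypothetical copy sketch, assuming 
HADOOP_HOME and HIVE_HOME point at those installations and that it is run from 
the Oozie home:

{noformat}
# Sketch only -- HADOOP_HOME and HIVE_HOME are assumptions, adapt to your layout.
cp $HADOOP_HOME/share/hadoop/common/hadoop-common-2.7.3.jar libext/
cp $HADOOP_HOME/share/hadoop/hdfs/hadoop-hdfs-2.7.3.jar libext/
cp $HIVE_HOME/lib/hive-exec-2.1.0.jar libext/
# ...repeat for the remaining jars in the listing above
{noformat}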

When you do a "prepare-war", the libext content is copied to 
oozie-server/webapps/oozie/WEB-INF/lib/.
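
To verify that the right versions landed in the exploded webapp, a simple 
check (assuming the default oozie-server layout shown above):

{noformat}ls oozie-server/webapps/oozie/WEB-INF/lib/ | grep -E 'hadoop-common|hive-common'{noformat}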

So the issue is ultimately in the build process: the Oozie distribution 
doesn't bundle the right Hadoop jars.

> Action failures related to HADOOP_CLASSPATH when using Hadoop 2.7.3
> -------------------------------------------------------------------
>
>                 Key: OOZIE-2764
>                 URL: https://issues.apache.org/jira/browse/OOZIE-2764
>             Project: Oozie
>          Issue Type: Bug
>          Components: action, core
>    Affects Versions: 4.3.0
>         Environment: Hadoop 2.7.3, Oozie 4.3.0, Hive 2.1.0, Pig 0.16.0, Spark 
> 1.6.1
>            Reporter: Alexandre Linte
>            Priority: Blocker
>
> Every action tested (MapReduce, Java, Spark, Hive, Pig) is failing with a 
> "java.lang.NoSuchFieldError: HADOOP_CLASSPATH". 
> Oozie has been compiled with the command:
> {noformat}
> $ bin/mkdistro.sh -DskipTests -Phadoop-2 -Dhadoop.version=2.7.3 
> -Dpig.version=0.16.0 -Dspark.version=1.6.1
> {noformat}
> The error stacktrace is below:
> {noformat}
> 2016-12-26T14:11:16+01:00 oozie01.bigdata.fr oozie INFO - 
> org.apache.oozie.service.CoordMaterializeTriggerService$CoordMaterializeTriggerRunnableUSER[-]
>  GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] CoordMaterializeTriggerService - 
> Curr Date= 2016-12-26T13:16Z, Num jobs to materialize = 0
> 2016-12-26T14:11:16+01:00 oozie01.bigdata.fr oozie INFO - 
> org.apache.oozie.service.CoordMaterializeTriggerService$CoordMaterializeTriggerRunnableUSER[-]
>  GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] Released lock for 
> [org.apache.oozie.service.CoordMaterializeTriggerService]
> 2016-12-26T14:11:16+01:00 oozie01.bigdata.fr oozie INFO - 
> org.apache.oozie.service.StatusTransitService$StatusTransitRunnableReleased 
> lock for [org.apache.oozie.service.StatusTransitService]
> 2016-12-26T14:11:16+01:00 oozie01.bigdata.fr oozie INFO - 
> org.apache.oozie.command.PurgeXCommandUSER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] 
> ACTION[-] STARTED Purge to purge Workflow Jobs older than [30] days, 
> Coordinator Jobs older than [7] days, and Bundlejobs older than [7] days.
> 2016-12-26T14:11:16+01:00 oozie01.bigdata.fr oozie INFO - 
> org.apache.oozie.command.PurgeXCommandUSER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] 
> ACTION[-] ENDED Purge deleted [0] workflows, [0] coordinatorActions, [0] 
> coordinators, [0] bundles
> 2016-12-26T14:11:16+01:00 oozie01.bigdata.fr oozie INFO - 
> org.apache.oozie.service.PauseTransitServiceReleased lock for 
> [org.apache.oozie.service.PauseTransitService]
> 2016-12-26T14:11:17+01:00 oozie01.bigdata.fr oozie INFO - 
> org.apache.oozie.command.wf.ActionStartXCommandUSER[shfs3453] GROUP[-] 
> TOKEN[] APP[SparkPi-test] JOB[0000000-161226114905740-oozie-W] 
> ACTION[0000000-161226114905740-oozie-W@spark-node] Start action 
> [0000000-161226114905740-oozie-W@spark-node] with user-retry state : 
> userRetryCount [0], userRetryMax [3], userRetryInterval [1]
> 2016-12-26T14:11:17+01:00 oozie01.bigdata.fr oozie INFO - 
> org.apache.oozie.service.HadoopAccessorServiceUSER[shfs3453] GROUP[-] TOKEN[] 
> APP[SparkPi-test] JOB[0000000-161226114905740-oozie-W] 
> ACTION[0000000-161226114905740-oozie-W@spark-node] Processing configuration 
> file [/opt/application/Oozie/current/conf/action-conf/default.xml] for action 
> [default] and hostPort [*]
> 2016-12-26T14:11:17+01:00 oozie01.bigdata.fr oozie INFO - 
> org.apache.oozie.service.HadoopAccessorServiceUSER[shfs3453] GROUP[-] TOKEN[] 
> APP[SparkPi-test] JOB[0000000-161226114905740-oozie-W] 
> ACTION[0000000-161226114905740-oozie-W@spark-node] Processing configuration 
> file [/opt/application/Oozie/current/conf/action-conf/spark.xml] for action 
> [spark] and hostPort [*]
> 2016-12-26T14:11:22+01:00 oozie01.bigdata.fr oozie INFO - 
> org.apache.oozie.service.HadoopAccessorServiceUSER[shfs3453] GROUP[-] TOKEN[] 
> APP[SparkPi-test] JOB[0000000-161226114905740-oozie-W] 
> ACTION[0000000-161226114905740-oozie-W@spark-node] Delegation Token Renewer 
> details: 
> Principal=rm/[email protected],Target=sandbox-RMS:8032,Renewer=rm/[email protected]
> 2016-12-26T14:11:22+01:00 oozie01.bigdata.fr oozie WARN - 
> org.apache.hadoop.mapreduce.JobResourceUploaderHadoop command-line option 
> parsing not performed. Implement the Tool interface and execute your 
> application with ToolRunner to remedy this.
> 2016-12-26T14:11:22+01:00 oozie01.bigdata.fr oozie WARN - 
> org.apache.hadoop.mapreduce.JobResourceUploaderNo job jar file set.  User 
> classes may not be found. See Job or Job#setJar(String).
> 2016-12-26T14:11:23+01:00 oozie01.bigdata.fr oozie ERROR - 
> org.apache.oozie.command.wf.ActionStartXCommandUSER[shfs3453] GROUP[-] 
> TOKEN[] APP[SparkPi-test] JOB[0000000-161226114905740-oozie-W] 
> ACTION[0000000-161226114905740-oozie-W@spark-node] Error,
> 2016-12-26T14:11:23.573405+01:00 localhost java.lang.NoSuchFieldError: 
> HADOOP_CLASSPATH
> 2016-12-26T14:11:23.573534+01:00 localhost     at 
> org.apache.hadoop.mapreduce.v2.util.MRApps.setClasspath(MRApps.java:248)
> 2016-12-26T14:11:23.573571+01:00 localhost     at 
> org.apache.hadoop.mapred.YARNRunner.createApplicationSubmissionContext(YARNRunner.java:458)
> 2016-12-26T14:11:23.573602+01:00 localhost     at 
> org.apache.hadoop.mapred.YARNRunner.submitJob(YARNRunner.java:285)
> 2016-12-26T14:11:23.573633+01:00 localhost     at 
> org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:240)
> 2016-12-26T14:11:23.573662+01:00 localhost     at 
> org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
> 2016-12-26T14:11:23.573691+01:00 localhost     at 
> org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
> 2016-12-26T14:11:23.573720+01:00 localhost     at 
> java.security.AccessController.doPrivileged(Native Method)
> 2016-12-26T14:11:23.573754+01:00 localhost     at 
> javax.security.auth.Subject.doAs(Subject.java:415)
> 2016-12-26T14:11:23.573783+01:00 localhost     at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
> 2016-12-26T14:11:23.573813+01:00 localhost     at 
> org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
> 2016-12-26T14:11:23.573842+01:00 localhost     at 
> org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:575)
> 2016-12-26T14:11:23.573870+01:00 localhost     at 
> org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:570)
> 2016-12-26T14:11:23.573899+01:00 localhost     at 
> java.security.AccessController.doPrivileged(Native Method)
> 2016-12-26T14:11:23.573928+01:00 localhost     at 
> javax.security.auth.Subject.doAs(Subject.java:415)
> 2016-12-26T14:11:23.573957+01:00 localhost     at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
> 2016-12-26T14:11:23.573985+01:00 localhost     at 
> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:570)
> 2016-12-26T14:11:23.574014+01:00 localhost     at 
> org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:561)
> 2016-12-26T14:11:23.574047+01:00 localhost     at 
> org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:1183)
> 2016-12-26T14:11:23.574076+01:00 localhost     at 
> org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1369)
> 2016-12-26T14:11:23.574106+01:00 localhost     at 
> org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:234)
> 2016-12-26T14:11:23.574162+01:00 localhost     at 
> org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:65)
> 2016-12-26T14:11:23.574193+01:00 localhost     at 
> org.apache.oozie.command.XCommand.call(XCommand.java:287)
> 2016-12-26T14:11:23.574222+01:00 localhost     at 
> java.util.concurrent.FutureTask.run(FutureTask.java:262)
> 2016-12-26T14:11:23.574252+01:00 localhost     at 
> org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:179)
> 2016-12-26T14:11:23.574286+01:00 localhost     at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 2016-12-26T14:11:23.574341+01:00 localhost     at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 2016-12-26T14:11:23.574372+01:00 localhost     at 
> java.lang.Thread.run(Thread.java:745)
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
