The job configuration didn't have a mapred.cache.files property, so I added
it to mapred-site.xml, pointing at the hive jars in the share/lib directory,
restarted the affected components and resubmitted the job... same result. I
checked the job config again, and there was still no mapred.cache.files
property.
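
For reference, the entry I added was roughly this (the jar names are from
memory, so treat the paths and versions as illustrative):

  <property>
    <name>mapred.cache.files</name>
    <value>/user/oozie/share/lib/hive/hive-exec-0.13.0.jar,/user/oozie/share/lib/hive/hive-common-0.13.0.jar</value>
  </property>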

Putting the hive jars into the workflow directory didn't work, either
(unless I did something wrong, which is always a possibility...)
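
What I actually did, more or less (the workflow app lives at
/user/root/oozie), was:

  hdfs dfs -cp /user/oozie/share/lib/hive/*.jar /user/root/oozie/lib/
  hdfs dfs -ls /user/root/oozie/lib

and then resubmitted.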


On 28 August 2014 20:48, Charles Robertson <[email protected]>
wrote:

> Hi Mohammad,
>
> I'm using Hadoop 2.4.0 (Hortonworks Data Platform 2.1). I'm not sure I
> found exactly what you wanted, but I did find the attached log file, which
> includes the classpath, and guess what? No hive JARs on it.
>
> I tried running the hive script from the console, using sudo to impersonate
> both oozie and the non-root user I had also tried. Both users had the
> classpath references required to run the hive script.
>
> Thanks for your help,
> Charles
>
>
> On 28 August 2014 04:18, Mohammad Islam <[email protected]>
> wrote:
>
>> Is it Hadoop 2.x or Hadoop 1.x?
>>
>> Can you please go to the launcher hadoop job, click on the "Job
>> File:" link and check whether the mapred.cache.files property includes
>> either of the jars (hive-exec*.jar and hive-common*.jar)?
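>>
>> If the sharelib is being picked up, you would expect to see something
>> along these lines in the job file (exact paths and versions will differ
>> on your cluster):
>>
>>   mapred.cache.files=
>>     hdfs://<namenode>:8020/user/oozie/share/lib/hive/hive-exec-0.13.0.jar,
>>     hdfs://<namenode>:8020/user/oozie/share/lib/hive/hive-common-0.13.0.jar,
>>     ...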
>>
>>
>> Alternatively, please copy all the hive jars from the share/lib/hive/
>> directory into your workflow/lib directory. You will need to remove the old
>> workflow app from the HDFS directory and upload the new one with the hive
>> jars included. Then resubmit the job. By the way, this is a dirty
>> workaround.
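>>
>> Roughly like this - the local hive lib path and the workflow app name here
>> are just placeholders for wherever yours live:
>>
>>   cp <local-hive-lib>/hive-*.jar my-workflow-app/lib/
>>   hdfs dfs -rm -r /user/<your-user>/my-workflow-app
>>   hdfs dfs -put my-workflow-app /user/<your-user>/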
>>
>> Regards,
>> Mohammad
>>
>>
>>
>>
>>
>> On Wednesday, August 27, 2014 9:20 AM, Charles Robertson <
>> [email protected]> wrote:
>>
>>
>>
>> I have also tried running the job again using a Shell action to run the
>> hive script, and I've tried doing this as another user (using sudo, and
>> making sure the share lib folder is copied to that user's HDFS folder). The
>> -doas flag doesn't work for some reason - I get a 500 internal server error.
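>>
>> (For reference, the -doas attempt was along these lines, with the host and
>> user names changed:
>>
>>   oozie job -oozie http://<oozie-host>:11000/oozie -config job.properties \
>>     -doas <other-user> -run
>>
>> It's that command that returns the 500.)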
>>
>> In either case I still get the same error - JA018 - although for the Shell
>> action the log looks like this:
>>
>> 140827143536335-oozie-oozi-W@ScoreData] Start action
>> [0000003-140827143536335-oozie-oozi-W@ScoreData] with user-retry state :
>> userRetryCount [0], userRetryMax [0], userRetryInterval [10]
>> 2014-08-27 15:16:30,354  WARN ShellActionExecutor:542 - USER[cxr] GROUP[-]
>> TOKEN[] APP[HustingsJe-v1] JOB[0000003-140827143536335-oozie-oozi-W]
>> ACTION[0000003-140827143536335-oozie-oozi-W@ScoreData] credentials is
>> null
>> for the action
>> 2014-08-27 15:16:31,110  INFO ShellActionExecutor:539 - USER[cxr] GROUP[-]
>> TOKEN[] APP[HustingsJe-v1] JOB[0000003-140827143536335-oozie-oozi-W]
>> ACTION[0000003-140827143536335-oozie-oozi-W@ScoreData] Trying to get job
>> [job_1409150341917_0007], attempt [1]
>> 2014-08-27 15:16:31,127  INFO ShellActionExecutor:539 - USER[cxr] GROUP[-]
>> TOKEN[] APP[HustingsJe-v1] JOB[0000003-140827143536335-oozie-oozi-W]
>> ACTION[0000003-140827143536335-oozie-oozi-W@ScoreData] checking action,
>> external ID [job_1409150341917_0007] status [RUNNING]
>> 2014-08-27 15:16:31,129  WARN ActionStartXCommand:542 - USER[cxr] GROUP[-]
>> TOKEN[] APP[HustingsJe-v1] JOB[0000003-140827143536335-oozie-oozi-W]
>> ACTION[0000003-140827143536335-oozie-oozi-W@ScoreData]
>> [***0000003-140827143536335-oozie-oozi-W@ScoreData***]Action
>> status=RUNNING
>> 2014-08-27 15:16:31,129  WARN ActionStartXCommand:542 - USER[cxr] GROUP[-]
>> TOKEN[] APP[HustingsJe-v1] JOB[0000003-140827143536335-oozie-oozi-W]
>> ACTION[0000003-140827143536335-oozie-oozi-W@ScoreData]
>> [***0000003-140827143536335-oozie-oozi-W@ScoreData***]Action updated in
>> DB!
>> 2014-08-27 15:17:09,865  INFO CallbackServlet:539 - USER[-] GROUP[-]
>> TOKEN[-] APP[-] JOB[0000003-140827143536335-oozie-oozi-W]
>> ACTION[0000003-140827143536335-oozie-oozi-W@ScoreData] callback for
>> action
>> [0000003-140827143536335-oozie-oozi-W@ScoreData]
>> 2014-08-27 15:17:10,009  INFO ShellActionExecutor:539 - USER[cxr] GROUP[-]
>> TOKEN[] APP[HustingsJe-v1] JOB[0000003-140827143536335-oozie-oozi-W]
>> ACTION[0000003-140827143536335-oozie-oozi-W@ScoreData] Trying to get job
>> [job_1409150341917_0007], attempt [1]
>> 2014-08-27 15:17:10,114  INFO ShellActionExecutor:539 - USER[cxr] GROUP[-]
>> TOKEN[] APP[HustingsJe-v1] JOB[0000003-140827143536335-oozie-oozi-W]
>> ACTION[0000003-140827143536335-oozie-oozi-W@ScoreData] action completed,
>> external ID [null]
>> 2014-08-27 15:17:10,121  WARN ShellActionExecutor:542 - USER[cxr] GROUP[-]
>> TOKEN[] APP[HustingsJe-v1] JOB[0000003-140827143536335-oozie-oozi-W]
>> ACTION[0000003-140827143536335-oozie-oozi-W@ScoreData] Launcher ERROR,
>> reason: Main class [org.apache.oozie.action.hadoop.ShellMain], exit code
>> [1]
>> 2014-08-27 15:17:10,150  INFO ActionEndXCommand:539 - USER[cxr] GROUP[-]
>> TOKEN[] APP[HustingsJe-v1] JOB[0000003-140827143536335-oozie-oozi-W]
>> ACTION[0000003-140827143536335-oozie-oozi-W@ScoreData] end executor for
>> wf
>> action 0000003-140827143536335-oozie-oozi-W with wf job
>> 0000003-140827143536335-oozie-oozi-W
>> 2014-08-27 15:17:10,189  INFO ActionEndXCommand:539 - USER[cxr] GROUP[-]
>> TOKEN[] APP[HustingsJe-v1] JOB[0000003-140827143536335-oozie-oozi-W]
>> ACTION[0000003-140827143536335-oozie-oozi-W@ScoreData] ERROR is
>> considered
>> as FAILED for SLA
>>
>>
>>
>> On 27 August 2014 12:07, Charles Robertson <[email protected]>
>> wrote:
>>
>> > Hi Mohammad,
>> >
>> > I have copied the whole /user/oozie/share folder to /user/root/share.
>> HDFS
>> > permissions are turned off (dfs.permissions.enabled = false). I have
>> > rerun the job (using the Hive action, not the Shell action) and this is
>> > from the job log:
>> >
>> > 2014-08-27 11:00:33,349  INFO HiveActionExecutor:539 - USER[root]
>> GROUP[-]
>> > TOKEN[] APP[HustingsJe-v1] JOB[0000002-140827104656981-oozie-oozi-W]
>> > ACTION[0000002-140827104656981-oozie-oozi-W@ScoreData] Trying to get
>> job
>> > [job_1409136406325_0007], attempt [1]
>> > 2014-08-27 11:00:33,422  INFO HiveActionExecutor:539 - USER[root]
>> GROUP[-]
>> > TOKEN[] APP[HustingsJe-v1] JOB[0000002-140827104656981-oozie-oozi-W]
>> > ACTION[0000002-140827104656981-oozie-oozi-W@ScoreData] action
>> completed,
>> > external ID [null]
>> > 2014-08-27 11:00:33,431  WARN HiveActionExecutor:542 - USER[root]
>> GROUP[-]
>> > TOKEN[] APP[HustingsJe-v1] JOB[0000002-140827104656981-oozie-oozi-W]
>> > ACTION[0000002-140827104656981-oozie-oozi-W@ScoreData] Launcher ERROR,
>> > reason: Main class [org.apache.oozie.action.hadoop.HiveMain], main()
>> threw
>> > exception, org/apache/hadoop/hive/conf/HiveConf
>> > 2014-08-27 11:00:33,431  WARN HiveActionExecutor:542 - USER[root]
>> GROUP[-]
>> > TOKEN[] APP[HustingsJe-v1] JOB[0000002-140827104656981-oozie-oozi-W]
>> > ACTION[0000002-140827104656981-oozie-oozi-W@ScoreData] Launcher
>> > exception: org/apache/hadoop/hive/conf/HiveConf
>> > java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
>> > at
>> org.apache.oozie.action.hadoop.HiveMain.setUpHiveSite(HiveMain.java:182)
>> > at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:196)
>> >  at
>> org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>> > at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>> >  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > at
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> >  at
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > at java.lang.reflect.Method.invoke(Method.java:606)
>> >  at
>> >
>> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>> > at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>> >  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>> > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>> >  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>> > at java.security.AccessController.doPrivileged(Native Method)
>> >  at javax.security.auth.Subject.doAs(Subject.java:415)
>> > at
>> >
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>> >  at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>> > Caused by: java.lang.ClassNotFoundException:
>> > org.apache.hadoop.hive.conf.HiveConf
>> > at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>> >  at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>> > at java.security.AccessController.doPrivileged(Native Method)
>> >  at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>> > at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>> >  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>> > at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>> >  ... 17 more
>> >
>> > 2014-08-27 11:00:33,450  INFO ActionEndXCommand:539 - USER[root]
>> GROUP[-]
>> > TOKEN[] APP[HustingsJe-v1] JOB[0000002-140827104656981-oozie-oozi-W]
>> > ACTION[0000002-140827104656981-oozie-oozi-W@ScoreData] end executor for
>> > wf action 0000002-140827104656981-oozie-oozi-W with wf job
>> > 0000002-140827104656981-oozie-oozi-W
>> > 2014-08-27 11:00:33,483  INFO ActionEndXCommand:539 - USER[root]
>> GROUP[-]
>> > TOKEN[] APP[HustingsJe-v1] JOB[0000002-140827104656981-oozie-oozi-W]
>> > ACTION[0000002-140827104656981-oozie-oozi-W@ScoreData] ERROR is
>> > considered as FAILED for SLA
>> >
>> > (I know running jobs as root and having permissions turned off is not
>> good
>> > practice, but this is mainly a learning exercise for me, not a
>> production
>> > system.)
>> >
>> > Thanks for your help,
>> > Charles
>> >
>> >
>> > On 27 August 2014 09:14, Mohammad Islam <[email protected]>
>> > wrote:
>> >
>> >> In general, running oozie as root is not a good practice.
>> >>
>> >> If you use 'root', the whole share lib should be in the
>> >> /user/root/share/lib/ directory with proper permission for user 'root'.
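>> >>
>> >> Something along these lines (adjust the ownership to whatever your
>> >> cluster expects):
>> >>
>> >>   hdfs dfs -cp /user/oozie/share /user/root/share
>> >>   hdfs dfs -chown -R root /user/root/share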
>> >>
>> >> After that, can you please run a shell action and share the error?
>> >>
>> >> Regards,
>> >> Mohammad
>> >>
>> >>
>> >> On Wednesday, August 27, 2014 12:55 AM, Charles Robertson <
>> >> [email protected]> wrote:
>> >>
>> >>
>> >>
>> >> I believe so - the install process put a large number of JARs in HDFS
>> >> under
>> >> /user/oozie/share/lib and I copied the Hive subfolder to
>> >> /user/root/share/lib just in case. (The workflow is running as root).
>> >>
>> >> Regards,
>> >> Charles
>> >>
>> >>
>> >>
>> >> On 26 August 2014 21:40, Mohammad Islam <[email protected]>
>> >> wrote:
>> >>
>> >> > Did you upload the 'sharelib' correctly?
>> >> >
>> >> >
>> >> >
>> >> > On Tuesday, August 26, 2014 8:06 AM, Charles Robertson <
>> >> > [email protected]> wrote:
>> >> >
>> >> >
>> >> >
>> >> > Update: I tried switching the Hive action to a Shell action and
>> calling
>> >> the
>> >> > Hive command. The log was less informative, but I still received
>> error
>> >> code
>> >> > JA018 with error message 'Main class
>> >> > [org.apache.oozie.action.hadoop.ShellMain], exit code [1]'.
>> >> >
>> >> > This smells to me like it can't find the executables when running the
>> >> > job. The log shows that 'User' is root (I have submitted and run the
>> >> > job from an SSH console logged in as root), which should have access
>> >> > to the JARs - and indeed I can run the Hive command from the console
>> >> > when logged in as root.
>> >> >
>> >> > Perhaps oozie is impersonating root, and not getting the same
>> >> > classpath or environment variables, or some such?
>> >> >
>> >> > Regards,
>> >> > Charles
>> >> >
>> >> >
>> >> >
>> >> >
>> >> >
>> >> > On 26 August 2014 13:33, Charles Robertson <
>> [email protected]
>> >> >
>> >> > wrote:
>> >> >
>> >> > > Hi all,
>> >> > >
>> >> > > My Hive action is failing with the following error:
>> >> > >
>> >> > > Launcher exception: org/apache/hadoop/hive/conf/HiveConf
>> >> > > java.lang.NoClassDefFoundError:
>> org/apache/hadoop/hive/conf/HiveConf
>> >> > >     at
>> >> > >
>> >>
>> org.apache.oozie.action.hadoop.HiveMain.setUpHiveSite(HiveMain.java:182)
>> >> > >     at
>> org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:196)
>> >> > >     at
>> >> > >
>> org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>> >> > >     at
>> org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>> >> > >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >> > >     at
>> >> > >
>> >> >
>> >>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> >> > >     at
>> >> > >
>> >> >
>> >>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> >> > >     at java.lang.reflect.Method.invoke(Method.java:606)
>> >> > >     at
>> >> > >
>> >> >
>> >>
>> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>> >> > >     at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>> >> > >     at
>> org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>> >> > >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>> >> > >     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>> >> > >     at java.security.AccessController.doPrivileged(Native Method)
>> >> > >     at javax.security.auth.Subject.doAs(Subject.java:415)
>> >> > >     at
>> >> > >
>> >> >
>> >>
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>> >> > >     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>> >> > > Caused by: java.lang.ClassNotFoundException:
>> >> > > org.apache.hadoop.hive.conf.HiveConf
>> >> > >     at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>> >> > >     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>> >> > >     at java.security.AccessController.doPrivileged(Native Method)
>> >> > >     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>> >> > >     at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>> >> > >     at
>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>> >> > >     at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>> >> > >     ... 17 more
>> >> > >
>> >> > > Error Code: JA018
>> >> > > Error Message: org/apache/hadoop/hive/conf/HiveConf
>> >> > >
>> >> > > This is my hive action:
>> >> > > <hive xmlns="uri:oozie:hive-action:0.2">
>> >> > >   <job-tracker>[private DNS name]:8050</job-tracker>
>> >> > >   <name-node>hdfs://[private DNS name]:8020</name-node>
>> >> > >   <script>score_tweets.sql</script>
>> >> > > </hive>
>> >> > >
>> >> > > job.properties:
>> >> > > nameNode=hdfs://[private DNS name]:8020
>> >> > > jobTracker=[private DNS name]:8050
>> >> > > queueName=default
>> >> > > oozie.libpath=${nameNode}/user/oozie/share/lib
>> >> > > oozie.wf.application.path=${nameNode}/user/root/oozie
>> >> > >
>> >> > > I am using oozie 4.0.0 on HDP 2.1
>> >> > >
>> >> > > Googling produced some useful things (see
>> >> > >
>> >> >
>> >>
>> https://support.pivotal.io/hc/en-us/articles/202563453-Oozie-hive-action-fails-with-java-lang-NoClassDefFoundError-org-apache-hadoop-hive-conf-HiveConf-
>> >> > > and
>> >> > >
>> >> >
>> >>
>> http://stackoverflow.com/questions/18369605/error-while-running-hive-action-in-oozie
>> >> > ),
>> >> > > but I've tried these, and I'm still getting the same error.
>> >> > >
>> >> > > I have:
>> >> > > - ensured that hive-exec is in /user/oozie/share/lib (also in
>> >> > > /user/root/share/lib, as this job is run as root)
>> >> > > - ensured that hive-exec exists in a path in the common.loader
>> >> > > property in the catalina.properties file
>> >> > > - even put the 'oozie.service.WorkflowAppService.system.libpath'
>> >> > > property into hive-site.xml like the stackoverflow answer suggested,
>> >> > > even though this doesn't make much sense to me (it is in
>> >> > > oozie-site.xml and I have overridden it with the hard-coded value
>> >> > > /user/oozie/share/lib - see the snippet below)
>> >> > > - not done anything with hive-default.xml, since the documentation
>> >> > > says this is now ignored.
>> >> > >
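>> >> > > For reference, the override I mentioned looks roughly like this in
>> >> > > my oozie-site.xml:
>> >> > >
>> >> > >   <property>
>> >> > >     <name>oozie.service.WorkflowAppService.system.libpath</name>
>> >> > >     <value>/user/oozie/share/lib</value>
>> >> > >   </property>
>> >> > >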
>> >> > > Does anyone have any suggestions?
>> >> > >
>> >> > > Thanks,
>> >> > > Charles
>> >> > >
>> >> >
>> >>
>> >
>> >
>>
>
>
