[ https://issues.apache.org/jira/browse/OOZIE-3313?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16566695#comment-16566695 ]

Hadoop QA commented on OOZIE-3313:
----------------------------------

PreCommit-OOZIE-Build started


> Hive example action fails
> -------------------------
>
>                 Key: OOZIE-3313
>                 URL: https://issues.apache.org/jira/browse/OOZIE-3313
>             Project: Oozie
>          Issue Type: Bug
>          Components: action, workflow
>    Affects Versions: trunk, 5.0.0, 5.1.0
>            Reporter: Daniel Becker
>            Assignee: Daniel Becker
>            Priority: Minor
>         Attachments: OOZIE-3313.patch
>
>
> The patch for OOZIE-2619 introduced the following property in 
> core/src/main/conf/action-conf/hive.xml:
> {code:xml}
> <property>
>     <name>mapreduce.input.fileinputformat.split.maxsize</name>
>     <value>256000000L</value>
> </property>
> {code}
> This makes the Hive example fail because the value "256000000L" cannot be 
> parsed as a Long due to the trailing 'L': the L suffix is valid on a long 
> literal in Java source code, but not when the value is parsed from a 
> configuration string. The action fails with the following stack trace:
> {noformat}
> java.lang.NumberFormatException: For input string: "256000000L"
>       at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>       at java.lang.Long.parseLong(Long.java:589)
>       at java.lang.Long.parseLong(Long.java:631)
>       at org.apache.hadoop.conf.Configuration.getLong(Configuration.java:1267)
>       at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getSplits(HadoopShimsSecure.java:305)
>       at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getCombineSplits(CombineHiveInputFormat.java:470)
>       at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:571)
>       at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:329)
>       at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:321)
>       at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:197)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1297)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1294)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
>       at org.apache.hadoop.mapreduce.Job.submit(Job.java:1294)
>       at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
>       at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
>       at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
>       at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
>       at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:431)
>       at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
>       at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>       at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
>       at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1650)
>       at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1409)
>       at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1192)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
>       at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
>       at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
>       at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
>       at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:311)
>       at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:409)
>       at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:425)
>       at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:714)
>       at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
>       at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
>       at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:310)
>       at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:294)
>       at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:101)
>       at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:69)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:410)
>       at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:55)
>       at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:223)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
>       at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:217)
>       at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
>       at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141)
> Job Submission failed with exception 'java.lang.NumberFormatException(For input string: "256000000L")'
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
> {noformat}
>  
>  
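A minimal sketch of the parse failure, assuming only standard java.lang.Long behaviour (Hadoop's Configuration.getLong() ultimately delegates to Long.parseLong(), as the quoted stack trace shows); the class name SplitMaxSizeParseDemo is purely illustrative and not part of Oozie:

{code:java}
// Standalone demo: Long.parseLong() rejects the Java-literal 'L' suffix
// when it appears inside a configuration string.
public class SplitMaxSizeParseDemo {
    public static void main(String[] args) {
        // Parses fine: prints 256000000
        System.out.println(Long.parseLong("256000000"));

        // Throws java.lang.NumberFormatException: For input string: "256000000L"
        System.out.println(Long.parseLong("256000000L"));
    }
}
{code}

Presumably the attached patch just drops the trailing 'L' from the value in core/src/main/conf/action-conf/hive.xml so that the property parses cleanly.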



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
