Hi David,

Thank you for your reply, it really helped. I've now solved the problem by following the link you sent.
Best,
Huanchen

On Apr 16, 2012, at 4:35 PM, David Alves wrote:

> Hi Huanchen,
>
> Your problem is that Hadoop is trying to read a value into a long that
> is too big: 18446744073709551593 is 2^64 - 23, an unsigned 64-bit
> counter value, well beyond Long.MAX_VALUE.
> This is a known Hadoop problem:
> https://issues.apache.org/jira/browse/MAPREDUCE-3583
>
> Cheers
> -david
>
> On Apr 16, 2012, at 9:47 PM, Huanchen Zhang wrote:
>
>> Hi,
>>
>> I'm using hadoop-0.20.2-cdh3u3 and whirr-0.7.1, and trying to run
>> MapReduce jobs from a local client.
>>
>> I get the following warnings and errors:
>>
>> 12/04/16 13:24:59 WARN mapred.JobClient: Error reading task
>> outputip-10-79-37-229.ec2.internal
>> 12/04/16 13:24:59 WARN mapred.JobClient: Error reading task
>> outputip-10-79-37-229.ec2.internal
>> 12/04/16 13:25:04 INFO mapred.JobClient: Task Id :
>> attempt_201204161919_0010_m_000001_1, Status : FAILED
>> java.lang.NumberFormatException: For input string: "18446744073709551593"
>>     at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>>     at java.lang.Long.parseLong(Long.java:441)
>>     at java.lang.Long.parseLong(Long.java:478)
>>     at org.apache.hadoop.util.ProcfsBasedProcessTree.constructProcessInfo(ProcfsBasedProcessTree.java:413)
>>     at org.apache.hadoop.util.ProcfsBasedProcessTree.getProcessTree(ProcfsBasedProcessTree.java:148)
>>     at org.apache.hadoop.util.LinuxResourceCalculatorPlugin.getProcResourceValues(LinuxResourceCalculatorPlugin.java:401)
>>     at org.apache.hadoop.mapred.Task.initialize(Task.java:532)
>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:306)
>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:416)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1157)
>>     at org.apache.hadoop.mapred.Child.main(Child.java:264)
>>
>> Does anyone have any ideas? Thank you!
>>
>> Best,
>> Huanchen
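
For anyone who hits this later: the root cause is Long.parseLong rejecting an
unsigned 64-bit counter read from /proc/<pid>/stat. Below is a minimal sketch
of the overflow and of a BigInteger-based workaround in the spirit of the
MAPREDUCE-3583 fix. The parseUnsigned64 helper is hypothetical, not Hadoop's
actual API; Java 8 later added Long.parseUnsignedLong for the same purpose.

import java.math.BigInteger;

public class ProcStatParseSketch {

    // Hypothetical helper, not Hadoop's API: parse an unsigned 64-bit
    // decimal string into a signed long with the same bit pattern,
    // instead of throwing the way Long.parseLong does.
    static long parseUnsigned64(String s) {
        BigInteger v = new BigInteger(s);
        if (v.signum() < 0 || v.bitLength() > 64) {
            throw new NumberFormatException("not an unsigned 64-bit value: " + s);
        }
        return v.longValue(); // values >= 2^63 wrap to negative longs
    }

    public static void main(String[] args) {
        // The value from the stack trace: 2^64 - 23, i.e. a small negative
        // counter reinterpreted as unsigned by /proc.
        String field = "18446744073709551593";
        try {
            Long.parseLong(field); // the call that fails in ProcfsBasedProcessTree
        } catch (NumberFormatException e) {
            System.out.println("Long.parseLong fails: " + e.getMessage());
        }
        System.out.println("wrapped: " + parseUnsigned64(field)); // prints -23
    }
}

Whether wrapping to a negative long is acceptable depends on how the value is
used downstream; the authoritative fix is the patch attached to the JIRA above.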
