NegativeArraySizeException in reducer with new api
--------------------------------------------------
Key: HADOOP-5907
URL: https://issues.apache.org/jira/browse/HADOOP-5907
Project: Hadoop Core
Issue Type: Bug
Components: mapred
Affects Versions: 0.20.0
Reporter: Amareshwari Sriramadasu
Fix For: 0.21.0
I observed one of the reducers failing with a NegativeArraySizeException when using the new API.
The exception trace:
{code}
java.lang.NegativeArraySizeException
	at org.apache.hadoop.io.BytesWritable.setCapacity(BytesWritable.java:119)
	at org.apache.hadoop.io.BytesWritable.setSize(BytesWritable.java:98)
	at org.apache.hadoop.io.BytesWritable.readFields(BytesWritable.java:153)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:67)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:40)
	at org.apache.hadoop.mapreduce.ReduceContext.nextKeyValue(ReduceContext.java:142)
	at org.apache.hadoop.mapreduce.ReduceContext.nextKey(ReduceContext.java:121)
	at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:189)
	at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:542)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:409)
	at org.apache.hadoop.mapred.Child.main(Child.java:159)
{code}
The corresponding line in ReduceContext is:
{code}
line#142 key = keyDeserializer.deserialize(key);
{code}
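For context on the failure mode: BytesWritable.readFields reads a length prefix from the stream and then grows its backing array to that size via setSize/setCapacity, so a corrupted or mis-framed input that yields a negative length makes the array allocation throw NegativeArraySizeException. The sketch below is not Hadoop code; it is a minimal stand-in (plain java.io, hypothetical class and method names) that reproduces the same pattern in isolation:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class NegativeSizeDemo {
    // Mimics the length-prefixed readFields pattern: read a 4-byte size,
    // then allocate a buffer of that size and fill it.
    static byte[] readLengthPrefixed(DataInputStream in) throws IOException {
        int size = in.readInt();     // a corrupted stream can yield a negative value here
        byte[] buf = new byte[size]; // throws NegativeArraySizeException when size < 0
        in.readFully(buf);
        return buf;
    }

    public static void main(String[] args) throws IOException {
        // Write a bogus negative length, as a corrupted record boundary might produce.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new DataOutputStream(bos).writeInt(-1);

        DataInputStream in =
                new DataInputStream(new ByteArrayInputStream(bos.toByteArray()));
        try {
            readLengthPrefixed(in);
            System.out.println("no exception");
        } catch (NegativeArraySizeException e) {
            System.out.println("NegativeArraySizeException");
        }
    }
}
```

In the real bug the negative size most likely comes from the deserializer being handed a stream positioned at the wrong offset, so the bytes interpreted as the length prefix are garbage.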