[
https://issues.apache.org/jira/browse/MAPREDUCE-796?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12736094#action_12736094
]
Suman Sehgal commented on MAPREDUCE-796:
----------------------------------------
Observations taken with different sets of config parameters for the
above-mentioned scenario are as follows:
Observation 1:
=============
Heap Size - (mapred.child.java.opts) --> 640MB
Compression codec - LzoCodec
Native lib enabled
Result --> ClassCastException caused by an OutOfMemoryError
Observation 2:
=============
Heap Size - (mapred.child.java.opts) --> 768MB
Compression codec - LzoCodec
Native lib enabled
Result --> No Exception/Error
Observation 3:
=============
Heap Size - (mapred.child.java.opts) --> 640MB
Compression codec - DefaultCodec
Native lib disabled
Result --> No Exception/Error
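
For reference, here is a minimal sketch of how the parameters varied above would
typically be set on a Hadoop 0.20 job configuration. It assumes the codec was
applied to map-output compression (the report does not say which compression
setting was in use), and the LZO codec class name
(com.hadoop.compression.lzo.LzoCodec, from hadoop-gpl-compression) is likewise
an assumption, since the LZO codec no longer ships with Hadoop core in 0.20:

import org.apache.hadoop.conf.Configuration;

public class ObservationConfigSketch {
  // Illustrative only; property names are the Hadoop 0.20 ones.
  public static Configuration configure(Configuration conf) {
    // Observations 1 and 2: child task heap size (640MB fails, 768MB passes)
    conf.set("mapred.child.java.opts", "-Xmx640m");           // or "-Xmx768m"

    // Turn on map-output compression and select the codec under test
    conf.setBoolean("mapred.compress.map.output", true);
    conf.set("mapred.map.output.compression.codec",
             "com.hadoop.compression.lzo.LzoCodec");          // observations 1 and 2

    // Observation 3: pure-Java DefaultCodec with the native library disabled
    // conf.set("mapred.map.output.compression.codec",
    //          "org.apache.hadoop.io.compress.DefaultCodec");
    // conf.setBoolean("hadoop.native.lib", false);
    return conf;
  }
}

The pattern of results (failure only with LZO at 640MB, no failure at 768MB or
with DefaultCodec) is consistent with the task simply running out of heap, so
the underlying failure is an OutOfMemoryError that the mapper then mishandles,
as the stack trace below shows.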
> Encountered "ClassCastException" on tasktracker while running wordcount with
> MultithreadedMapRunner
> ---------------------------------------------------------------------------------------------------
>
> Key: MAPREDUCE-796
> URL: https://issues.apache.org/jira/browse/MAPREDUCE-796
> Project: Hadoop Map/Reduce
> Issue Type: Bug
> Components: examples
> Affects Versions: 0.20.1
> Reporter: Suman Sehgal
>
> A ClassCastException caused by an OutOfMemoryError is encountered on the
> tasktracker while running the wordcount example with MultithreadedMapRunner.
> Stack trace:
> =========
> java.lang.ClassCastException: java.lang.OutOfMemoryError cannot be cast to java.lang.RuntimeException
>         at org.apache.hadoop.mapreduce.lib.map.MultithreadedMapper.run(MultithreadedMapper.java:149)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:581)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:303)
>         at org.apache.hadoop.mapred.Child.main(Child.java:170)
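
The trace points at the rethrow logic in MultithreadedMapper.run() (line 149 in
0.20.1), which apparently assumes that any Throwable left behind by a worker
thread that is not an IOException or an InterruptedException must be a
RuntimeException; an OutOfMemoryError violates that assumption and surfaces as
the ClassCastException above. The following is only a sketch of that pattern
and of one possible fix (rethrowing Errors unchanged and wrapping other checked
Throwables), not the verbatim Hadoop source:

import java.io.IOException;

// Sketch of the suspected pattern: the main thread rethrows a Throwable
// captured by a worker thread, and anything that is neither IOException nor
// InterruptedException is blindly cast to RuntimeException.
public class RethrowSketch {

  static void rethrowBuggy(Throwable th) throws IOException, InterruptedException {
    if (th instanceof IOException) {
      throw (IOException) th;
    } else if (th instanceof InterruptedException) {
      throw (InterruptedException) th;
    } else {
      // ClassCastException when th is an Error such as OutOfMemoryError
      throw (RuntimeException) th;
    }
  }

  // One possible fix: let Errors propagate unchanged and wrap any remaining
  // checked Throwable rather than casting it.
  static void rethrowFixed(Throwable th) throws IOException, InterruptedException {
    if (th instanceof IOException) {
      throw (IOException) th;
    } else if (th instanceof InterruptedException) {
      throw (InterruptedException) th;
    } else if (th instanceof Error) {
      throw (Error) th;               // e.g. OutOfMemoryError surfaces as itself
    } else if (th instanceof RuntimeException) {
      throw (RuntimeException) th;
    } else {
      throw new RuntimeException(th);
    }
  }
}

With the fixed variant, observation 1 would report the real OutOfMemoryError
instead of masking it behind a ClassCastException.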