While running a MapReduce job, I received the error below. The serialized
size of the RowResult seems to be too large. What do you think?
----
08/11/27 13:42:49 INFO mapred.JobClient: map 0% reduce 0%
08/11/27 13:42:55 INFO mapred.JobClient: map 50% reduce 0%
08/11/27 13:43:09 INFO mapred.JobClient: map 50% reduce 8%
08/11/27 13:43:13 INFO mapred.JobClient: map 50% reduce 16%
08/11/27 13:43:15 INFO mapred.JobClient: Task Id : attempt_200811271320_0006_m_000000_0, Status : FAILED
java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2786)
at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
at java.io.DataOutputStream.write(DataOutputStream.java:90)
at org.apache.hadoop.hbase.util.Bytes.writeByteArray(Bytes.java:65)
at org.apache.hadoop.hbase.io.Cell.write(Cell.java:152)
at org.apache.hadoop.hbase.io.HbaseMapWritable.write(HbaseMapWritable.java:196)
at org.apache.hadoop.hbase.io.RowResult.write(RowResult.java:245)
at org.apache.hadoop.hbase.util.Writables.getBytes(Writables.java:49)
at org.apache.hadoop.hbase.util.Writables.copyWritable(Writables.java:134)
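For what it's worth, here is a minimal sketch (not the actual HBase code; the cell counts and sizes are made-up) of why this path runs out of heap: Writables.getBytes() funnels every cell of the row through a single DataOutputStream backed by a ByteArrayOutputStream, so the entire serialized row must fit in memory at once, and the backing buffer roughly doubles (via Arrays.copyOf, as in the trace) each time it grows:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class RowSerializationSketch {

    // Serialize a toy "row" of `cells` cells, each `valueBytes` long,
    // into one in-memory buffer and return the total size. This mirrors
    // the length-prefixed byte-array writing seen in the stack trace.
    public static int serializedSize(int cells, int valueBytes) throws IOException {
        byte[] cellValue = new byte[valueBytes];
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(baos);
        for (int i = 0; i < cells; i++) {
            out.writeInt(cellValue.length); // 4-byte length prefix per cell
            out.write(cellValue);           // cell bytes copied into the growing buffer
        }
        out.flush();
        return baos.size();
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical row: 1000 cells of 10 KB each -> ~10 MB in one buffer.
        // A row with millions of cells (or very large values) scales the same
        // way and can exceed the map task's heap by itself.
        System.out.println(serializedSize(1000, 10 * 1024));
    }
}
```

So a single very wide row (or very large cell values) can blow the heap regardless of how many rows the job processes.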
--
Best Regards, Edward J. Yoon @ NHN, corp.
[EMAIL PROTECTED]
http://blog.udanax.org