Hi,

We have been trying to increase the heap used by Hadoop when running a single, small job. The job fails with Java heap space / GC overhead errors. It's a 64-bit Fedora 14 box with 8 GB of memory.
- Memory is not exhausted.
- ulimits on open files, descriptors, etc. are not exhausted.

What could be causing this? Thanks for any tips,
Darren
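For reference, these are the kinds of settings we have been adjusting to raise the heap. This is a sketch assuming a 0.20/1.x-style configuration; the -Xmx value is just an example, not what we actually need:

```xml
<!-- mapred-site.xml: raise the heap for each task JVM
     (property name is from the Hadoop 0.20/1.x era) -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx1024m</value>
</property>
```

We have also tried raising HADOOP_HEAPSIZE in hadoop-env.sh, which controls the heap of the Hadoop daemons themselves rather than the per-task JVMs.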
