Which version of Hadoop are you using? I faced a similar exception with Rhipe-0.68 on CDH4: that version of Rhipe does not support CDH4. My problem was resolved with the release of Rhipe-0.73, so check whether your release supports your Hadoop version.
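As for the heap-space question quoted below: assuming the OutOfMemoryError is thrown by the JVM that rJava embeds in the R process (the traceback through .jrcall/.jcall points that way), one untested sketch is to raise that JVM's heap before any JVM-backed package loads. rJava only honours java.parameters if it is set before the JVM starts, so it has to come first in the script; "-Xmx2g" is just an example value:

    # Sketch: raise the rJava JVM heap before rJava/RHive start the JVM.
    # "-Xmx2g" is an example heap size, not a recommendation; adjust to your data.
    options(java.parameters = "-Xmx2g")
    library(rJava)
    library(RHive)

If the error instead comes from the map/reduce tasks on the cluster, the task heap is governed by the Hadoop property mapred.child.java.opts (for example -Xmx1024m), not by anything set inside R.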
On Wed, May 15, 2013 at 7:42 PM, rohit sarewar <[email protected]> wrote:
> Hi
>
> I am running an RHive job on a Hadoop cluster on 40 MB of data.
> It throws the error "java.lang.OutOfMemoryError: Java heap space" as
> shown below.
> *How do I set Java heap space inside my RHive job?*
>
> [mycompany@arjun10 Desktop]$ Rscript muFreqBig.R
> Loading required package: rJava
> Loading required package: methods
> Loading required package: Rserve
> This is RHive 0.0-7. For overview type ‘?RHive’.
> HIVE_HOME=/usr/lib/hive
> call rhive.init() because HIVE_HOME is set.
> 13/05/15 05:51:37 WARN conf.Configuration: fs.default.name is deprecated. Instead, use fs.defaultFS
> 13/05/15 05:51:37 WARN conf.Configuration: fs.default.name is deprecated. Instead, use fs.defaultFS
> 13/05/15 05:51:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, :
>   java.lang.OutOfMemoryError: Java heap space
> Calls: muFreqBig ... <Anonymous> -> .jrcall -> .jcall -> .jcheck -> .Call
> Execution halted
>
> Thanks
> Rohit Sarewar
