Hi all,

I run my project from inside Eclipse. Maybe this is the problem and the
value in mapred-site.xml isn't recognised?

Basically I have a file with the following code in order to run my project
through Eclipse.
public int run(String[] arg0) throws Exception {
    ..
    giraphConf.setWorkerConfiguration(0, 1, 100.0f); // 3rd arg = minPercentResponded
    giraphConf.setLocalTestMode(true);
    giraphConf.USE_SUPERSTEP_COUNTERS.set(giraphConf, false);

    giraphConf.setMaxNumberOfSupersteps(4000);
    giraphConf.LOG_LEVEL.set(giraphConf, "error");
    giraphConf.SPLIT_MASTER_WORKER.set(giraphConf, false);
    giraphConf.USE_OUT_OF_CORE_GRAPH.set(giraphConf, true);

    GiraphJob job = new GiraphJob(giraphConf, getClass().getName());
    // Set the output path
    FileOutputFormat.setOutputPath(job.getInternalJob(), new Path(getOutputPath()));

    job.run(true);

    return 1;
}

public static void main(String[] args) throws Exception {
    ToolRunner.run(new GiraphAppRunner(), args);
}
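
Maybe I could also set the property from your mail programmatically, so it
doesn't depend on mapred-site.xml being on the Eclipse classpath? Just a
sketch of what I mean (I'm not sure it even helps in local test mode, since
with setLocalTestMode(true) everything runs inside the Eclipse JVM, where
the heap comes from the -Xmx VM argument of the Eclipse run configuration):

    // Sketch: set the child-task heap directly on the configuration
    // instead of relying on mapred-site.xml being found on the classpath.
    giraphConf.set("mapred.child.java.opts", "-Xmx4096m");
    // In local test mode the job runs in this same JVM, so the heap is
    // governed by the -Xmx passed in Eclipse's run configuration instead.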




Best,
Xenia

2016-11-05 8:46 GMT+02:00 Panagiotis Liakos <p.lia...@di.uoa.gr>:

> Hi all,
>
> This property in hadoop/conf/mapred-site.xml works for me:
>
> <property>
> <name>mapred.map.child.java.opts</name>
> <value>-Xmx10g</value>
> </property>
>
> Regards,
> Panagiotis
>
> 2016-11-04 23:11 GMT+02:00 Xenia Demetriou <xenia...@gmail.com>:
> > Hi,
> > I have the same problem. I added the following to
> > mapred-site.xml and hadoop-env.sh, but I still have the same problem.
> > I tried various values below but nothing increases the memory.
> >
> > mapred-site.xml:
> > <property>
> >     <name>mapred.child.java.opts</name>
> >     <value>-Xms256m </value>
> >     <value>-Xmx4096m</value>
> > </property>
> >
> > hadoop-env.sh:
> > export HADOOP_HEAPSIZE=3072
> > export HADOOP_OPTS="-Xmx4096m"
> >
> >
>
