On Solaris, you may also want to change:

export HADOOP_IDENT_STRING=`/usr/xpg4/bin/id -u -n`
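The `/usr/xpg4/bin/id` path only exists on Solaris, so a small guard keeps the same line portable. A minimal sketch, assuming `whoami` as the fallback on other systems:

```shell
# /usr/xpg4/bin/id is Solaris-specific; fall back to whoami elsewhere.
ID_BIN=/usr/xpg4/bin/id
if [ -x "$ID_BIN" ]; then
  export HADOOP_IDENT_STRING=$("$ID_BIN" -u -n)
else
  export HADOOP_IDENT_STRING=$(whoami)
fi
echo "HADOOP_IDENT_STRING=$HADOOP_IDENT_STRING"
```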



On 12/7/09 4:42 PM, "pavel kolodin" <[email protected]> wrote:

> 
> I ran into the same problem two days ago. It happens because Java cannot
> allocate enough memory to fork the process that runs the command.
> I then tuned the -Xmx options for these variables in
> hadoop-env.sh, and they are now:
> 
> export HADOOP_HEAPSIZE=1000
> export HADOOP_NAMENODE_OPTS="-Xmx612m -Dcom.sun.management.jmxremote
> $HADOOP_NAMENODE_OPTS"
> export HADOOP_SECONDARYNAMENODE_OPTS="-Xmx412m
> -Dcom.sun.management.jmxremote $HADOOP_SECONDARYNAMENODE_OPTS"
> export HADOOP_DATANODE_OPTS="-Xmx412m -Dcom.sun.management.jmxremote
> $HADOOP_DATANODE_OPTS"
> export HADOOP_BALANCER_OPTS="-Xmx412m -Dcom.sun.management.jmxremote
> $HADOOP_BALANCER_OPTS"
> export HADOOP_JOBTRACKER_OPTS="-Xmx412m -Dcom.sun.management.jmxremote
> $HADOOP_JOBTRACKER_OPTS"
> export HADOOP_TASKTRACKER_OPTS="-Xmx412m"
> export HADOOP_CLIENT_OPTS="-Xmx512m"
> 
> (Note the unit suffix: a bare number like -Xmx612 is read as bytes, which
> is below the JVM's minimum heap, so the daemon would refuse to start.)
> 
> and now whoami is called successfully.
> 
> I have 3.5 GB of RAM on the master and 2.0 GB on the slave.
> 
> 
>> I am running Hadoop-0.20.1 on a Solaris box with dfs.permissions set to
>> false.
>> 
>> There is a working version of whoami on the system.
>> 
>> Folders and files created by my program show up with an owner of DrWho.
>> 
>> Folders and files created by Hbase-0.20.1 appear with the proper owner
>> name.
>> 
>> 
>> Do I need to move the whoami command someplace where map/reduce jobs can
>> find it?
>> 
>> 
>> Bill
>> 
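For what it's worth, the symptom above points at the environment rather than the binary's location: Hadoop shells out to `whoami` to determine the owner, and when that call fails (out of memory on fork, or `whoami` not on the daemon's PATH) you get the placeholder owner DrWho. A quick check, run from the same shell that starts the daemons:

```shell
# Check that whoami resolves in the environment the Hadoop daemons
# inherit; if it does not, new files get the fallback owner (DrWho).
if command -v whoami >/dev/null 2>&1; then
  echo "whoami found at $(command -v whoami); reports user: $(whoami)"
else
  echo "whoami not on PATH; expect the fallback owner on new files"
fi
```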
