"stand alone mode" does not use any daemons - it's just a way of running
mapreduce in a single process for quick testing and getting started. You
need "pseudo distributed mode" if you want to run the daemons.
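For reference, the pseudo-distributed setup from the 0.20 quickstart looks roughly like this (the localhost host/ports are the stock example values, not anything specific to your machine); the NullPointerException below is typically what the 0.20 daemons print when fs.default.name is still the standalone file:/// default:

```xml
<!-- conf/core-site.xml: point the filesystem at a local HDFS namenode -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- conf/hdfs-site.xml: single node, so one replica -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

<!-- conf/mapred-site.xml: local jobtracker -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```

Remember to format the namenode once (bin/hadoop namenode -format) before running start-all.sh.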

-Todd

On Thu, Nov 5, 2009 at 12:06 AM, Mohan Agarwal <mohan.agarwa...@gmail.com> wrote:

> Hi,
>    I forgot to mention that I have installed *hadoop-0.20* in *standalone
> mode* on my system.
>
> Thanking You
> Mohan Agarwal
>
> On Thu, Nov 5, 2009 at 1:19 PM, Mohan Agarwal <mohan.agarwa...@gmail.com> wrote:
>
> > Hi,
> >     I have installed *hadoop-0.20* on my system. I am facing a problem
> > while starting hadoop using the *start-all.sh* command.
> >     I am getting the following error:
> >
> > *localhost: starting secondarynamenode, logging to
> > /usr/lib/hadoop/bin/../logs/hadoop-root-secondarynamenode-magarwal.in.connectivasystems.com.out*
> > localhost: Exception in thread "main" java.lang.NullPointerException
> > localhost:      at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:134)
> > localhost:      at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:161)
> > localhost:      at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:165)
> > localhost:      at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:131)
> > localhost:      at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:115)
> > localhost:      at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:476)
> >
> > The above error suggests that the *Secondary Name Node* is not getting
> > started, *although other things are working fine*. I am also using Hive
> > (with Derby Network Server as a metadata store), and from the Hive CLI,
> > hive queries are getting executed.
> >
> > Please help me to solve this problem.
> >
> > Thanking You
> > Mohan Agarwal
> >
> >
>