I got it.  The Hadoop installation had been done by root (thankfully, I can't 
claim credit for that), and when I chowned everything over to my account, I 
missed a few directories.  Filling in those blanks made it start working.
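
For anyone who hits the same thing, the fix was roughly along these lines.
The paths and user name below are just stand-ins for my actual setup; yours
will depend on where Hadoop is installed and where hadoop.tmp.dir points
(by default /tmp/hadoop-${user.name}):

    # look for anything still owned by root
    ls -lR /usr/local/hadoop | grep root
    # hand the install tree and the HDFS scratch dirs back to your user
    sudo chown -R keith:keith /usr/local/hadoop
    sudo chown -R keith:keith /tmp/hadoop-keith

Then reformat the namenode and rerun start-all.sh.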

On Jul 27, 2012, at 11:30 , anil gupta wrote:

> Hi Keith,
> 
> Does a ping to localhost return a reply? Try telnetting to localhost 9000.
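> 
> Something like this (9000 being whatever port fs.default.name uses):
> 
>     ping -c 1 localhost
>     telnet localhost 9000
> 
> If the telnet is refused, nothing is listening on that port, i.e. the
> namenode never actually came up.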
> 
> Thanks,
> Anil
> 
> On Fri, Jul 27, 2012 at 11:22 AM, Keith Wiley <kwi...@keithwiley.com> wrote:
> 
>> I'm plagued with this error:
>> Retrying connect to server: localhost/127.0.0.1:9000.
>> 
>> I'm trying to set up Hadoop on a new machine, just a basic
>> pseudo-distributed setup.  I've done this quite a few times on other
>> machines, but this time I'm kinda stuck.  I formatted the namenode without
>> any obvious errors and ran start-all.sh with no errors on stdout.  However,
>> the logs are full of the error above, and if I attempt to access HDFS
>> (e.g., "hadoop fs -ls /") I get the same error again.  Obviously, my
>> core-site.xml sets fs.default.name to "hdfs://localhost:9000".
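>> 
>> For completeness, the relevant bit of my core-site.xml is just:
>> 
>>     <property>
>>       <name>fs.default.name</name>
>>       <value>hdfs://localhost:9000</value>
>>     </property>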
>> 
>> I assume something is wrong with /etc/hosts, but I'm not sure how to fix
>> it.  If "hostname" returns X and "hostname -f" returns Y, then what are the
>> corresponding entries in /etc/hosts?
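>> My best guess (with X and Y as the placeholders above) is something like:
>> 
>>     127.0.0.1   localhost
>>     127.0.1.1   Y X
>> 
>> but I'm not confident that's right, hence the question.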
>> 
>> Thanks for any help.
>> 


________________________________________________________________________________
Keith Wiley     kwi...@keithwiley.com     keithwiley.com    music.keithwiley.com

"What I primarily learned in grad school is how much I *don't* know.
Consequently, I left grad school with a higher ignorance to knowledge ratio than
when I entered."
                                           --  Keith Wiley
________________________________________________________________________________
