You don't "need" to specify a path. If you don't specify a path argument for
ls, then it uses your home directory in HDFS ("/user/<yourusernamehere>").
When you first started HDFS, /user/hadoop didn't exist, so 'hadoop fs -ls'
--> 'hadoop fs -ls /user/hadoop' --> directory not found. When you mkdir'd
'lol', you were effectively doing "mkdir -p /user/hadoop/lol", so it created
your home directory along the way.

- Aaron

On Tue, Nov 10, 2009 at 1:30 PM, zenkalia <[email protected]> wrote:

> ok, things are working.. i must have forgotten what i did when first
> setting up hadoop...
>
> should these responses be considered inconsistent/an error? hmm.
>
> hadoop dfs -ls
> error
> hadoop dfs -ls /
> irrelevant stuff about the path you're in
> hadoop dfs -mkdir lol
> works fine
> hadoop dfs -ls
> Found 1 items
> drwxr-xr-x   - hadoop supergroup          0 2009-11-10 05:28
> /user/hadoop/lol
>
> thanks stephen.
> -mike
>
> On Tue, Nov 10, 2009 at 1:19 PM, Stephen Watt <[email protected]> wrote:
>
> > You need to specify a path. Try "bin/hadoop dfs -ls /"
> >
> > Steve Watt
> >
> > From: zenkalia <[email protected]>
> > To: [email protected]
> > Date: 11/10/2009 03:04 PM
> > Subject: error setting up hdfs?
> >
> > had...@hadoop1:/usr/local/hadoop$ bin/hadoop dfs -ls
> > ls: Cannot access .: No such file or directory.
> >
> > anyone else get this one? i started changing settings on my box to get
> > all of my cores working, but immediately hit this error. since then i
> > started from scratch and have hit this error again. what am i missing?
