In the namenode log, it does show that it has lost the heartbeat to the datanode. But
the datanode log looks fine, no errors there.
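(A sketch of one way to cross-check this, assuming the stock bin/hadoop launcher from this era is on the PATH: dfsadmin -report prints the namenode's own list of live and dead datanodes, so you can see whether it matches what the admin UI shows.)

```shell
# Ask the namenode which datanodes it currently considers live/dead.
# Assumes the hadoop launcher script is on the PATH; the report's exact
# output format varies between Hadoop versions.
hadoop dfsadmin -report
```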
Any ideas?
thanks
-John
On 11/28/06, Raghu Angadi <[EMAIL PROTECTED]> wrote:
John Wang wrote:
And in the data node log, I see the line:
2006-11-24 10:05:07,261 INFO org.apache.hadoop.dfs.DataNode: Opened server at 50010
So looks like the data node is alive.
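(For reference, that wrapped log line re-joined; a quick grep pulls out the "Opened server at" part, whose port should match the 50010 the client is trying to reach:)

```shell
# The datanode's startup line, with the line-wrapped port number re-joined.
line='2006-11-24 10:05:07,261 INFO org.apache.hadoop.dfs.DataNode: Opened server at 50010'
# Extract "Opened server at <port>" to confirm the datanode's listening port.
echo "$line" | grep -o 'Opened server at [0-9]*'
# -> Opened server at 50010
```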
Also, by clicking on the Browse the filesystem link in the admin ui, I
am taken to the address:
http://19
I started HDFS by running start-dfs.sh (also tried start-all.sh, but makes
no difference).
I was able to create directories by running: hadoop dfs -mkdir
and using the hadoop java api, e.g. getting a FileSystem instance and do:
fs.mkdirs().
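(The same two operations through the shell, as a sketch; /tmp/jtest and the local source file are placeholder names. A mkdir is a namenode-only metadata operation, while copying a file in is what exercises the client-to-datanode connection that file creation needs, which is why one works and the other fails:)

```shell
# Directory creation is a namenode-only metadata operation, so it succeeds
# even when datanode connections are broken.
hadoop dfs -mkdir /tmp/jtest
# Writing actual file data requires connecting to a datanode (port 50010),
# which is the step reported as failing here.
hadoop dfs -copyFromLocal /etc/hosts /tmp/jtest/hosts
```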
While trying to do FileSystem.createNewFile, I get: "W
On 11/24/06, John Wang <[EMAIL PROTECTED]> wrote:
Hi:
After starting HDFS, I am able to create directories via the Hadoop api
and the shell app.
However I am not able to create a new file: I keep on getting problems
connecting to the data node. (on localhost:50010)
By going to the admin UI, I see Live Datanodes listed to be correct to be
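(A sketch for checking the failing connection directly, assuming nc is available on the client machine; if nothing is listening on localhost:50010, the client's datanode connection cannot succeed regardless of what the logs say:)

```shell
# Probe the datanode's data-transfer port from the client machine.
# -z: just test whether the connection opens, send no data.
# localhost:50010 matches the host/port in the error above.
nc -z localhost 50010 && echo "port 50010 reachable" || echo "port 50010 not reachable"
```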