on? Or should I just ignore it?
I'm not sure, but I guess there is no problem.
Does anyone have any experience with that?
Regards, Edward J. Yoon
On Wed, Jul 23, 2008 at 11:05 PM, Jose Vidal [EMAIL PROTECTED] wrote:
Thanks! That worked. I was able to run DFS and put some files
Edward J. Yoon [EMAIL PROTECTED] wrote:
So, do I need to change the hosts file on all the slaves, or just the
namenode?
Just the namenode.
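For anyone hitting the same problem later: the fix is to split the 127.0.0.1 entry in the namenode's /etc/hosts so the machine's own hostname resolves to its real network address instead of loopback. A sketch of the change (192.0.2.10 is a placeholder address; substitute your namenode's actual IP, and edit /etc/hosts itself as root rather than a demo copy):

```shell
# Work on a demo copy; on the real namenode you would edit /etc/hosts.
printf '127.0.0.1 localhost hermes.cse.sc.edu\n' > /tmp/hosts.demo

# Before: the FQDN resolves to 127.0.0.1, so the namenode binds loopback
# and remote slaves cannot reach it.
# After: localhost stays on loopback; the FQDN maps to the real address.
sed -i 's/^127\.0\.0\.1 localhost hermes\.cse\.sc\.edu$/127.0.0.1   localhost\n192.0.2.10  hermes.cse.sc.edu/' /tmp/hosts.demo

cat /tmp/hosts.demo
```

After the change, restart the daemons (stop-all.sh, then start-all.sh) so they re-resolve the hostname.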
Thanks, Edward
On Wed, Jul 23, 2008 at 7:45 AM, Jose Vidal [EMAIL PROTECTED] wrote:
Yes, the host file just has:
127.0.0.1 localhost hermes.cse.sc.edu
I'm trying to install Hadoop on our Linux machines, but after
start-all.sh none of the slaves can connect:
2008-07-22 16:35:27,534 INFO org.apache.hadoop.dfs.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host =
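A quick way to check for this symptom before digging through logs: ask the resolver what the machine's own hostname maps to. If it comes back as 127.0.0.1, the daemons will bind loopback and slaves cannot connect. A minimal check using standard tools (localhost is shown as a control line):

```shell
# Control: localhost should resolve to loopback.
getent hosts localhost

# The machine's own FQDN should resolve to its real address,
# NOT 127.0.0.1. (|| true so a missing FQDN doesn't abort a script.)
getent hosts "$(hostname -f)" || true
```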