It looks like the client machine from which you call -put cannot connect to the
datanodes.
It could be a firewall issue or wrong configuration parameters on the
client side.
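As a quick sanity check (assuming the default datanode transfer port 50010 and
that your client reads its settings from conf/hadoop-site.xml; adjust for your
setup), you could try something like this from the client machine:

    # can the client reach a datanode's data transfer port?
    telnet <datanode-host> 50010

    # does the client's config point at the right namenode?
    grep -A1 fs.default.name conf/hadoop-site.xml

If the telnet connection is refused, it usually points at a firewall rule or a
datanode bound to the wrong interface/hostname (see dfs.datanode.address and
the hostnames the datanodes register with).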
Alexander Arimond wrote:
hi,
I'm new to Hadoop and am just testing it at the moment.
I set up a cluster with 2 nodes and it seems like they are running
normally;
the log files of the namenode and the datanodes don't show any errors.
The firewall should be configured correctly.
But when I try to upload a file to the DFS, I get the following message:
[EMAIL PROTECTED]:~/hadoop$ bin/hadoop dfs -put file.txt file.txt
08/06/12 14:44:19 INFO dfs.DFSClient: Exception in createBlockOutputStream java.net.ConnectException: Connection refused
08/06/12 14:44:19 INFO dfs.DFSClient: Abandoning block blk_5837981856060447217
08/06/12 14:44:28 INFO dfs.DFSClient: Exception in createBlockOutputStream java.net.ConnectException: Connection refused
08/06/12 14:44:28 INFO dfs.DFSClient: Abandoning block blk_2573458924311304120
08/06/12 14:44:37 INFO dfs.DFSClient: Exception in createBlockOutputStream java.net.ConnectException: Connection refused
08/06/12 14:44:37 INFO dfs.DFSClient: Abandoning block blk_1207459436305221119
08/06/12 14:44:46 INFO dfs.DFSClient: Exception in createBlockOutputStream java.net.ConnectException: Connection refused
08/06/12 14:44:46 INFO dfs.DFSClient: Abandoning block blk_-8263828216969765661
08/06/12 14:44:52 WARN dfs.DFSClient: DataStreamer Exception: java.io.IOException: Unable to create new block.
08/06/12 14:44:52 WARN dfs.DFSClient: Error Recovery for block blk_-8263828216969765661 bad datanode[0]
I don't know what that means and couldn't find anything about it.
Hope somebody can help with that.
Thank you!