Are you sure that all the versions of Hadoop are the same?

2011/3/4 <[email protected]>
> Thanks Adarsh for the reply.
>
> Just to clarify the issue a bit: I am able to do all operations
> (-copyFromLocal, -get, -rmr, etc.) from the master node, so I am confident
> that the communication between all the Hadoop machines is fine. But when I
> do the same operations from another machine that has the same Hadoop
> config, I get the errors below. However, I can do -lsr and it lists the
> files correctly.
>
> Praveen
>
> -----Original Message-----
> From: ext Adarsh Sharma [mailto:[email protected]]
> Sent: Friday, March 04, 2011 12:12 AM
> To: [email protected]
> Subject: Re: Unable to use hadoop cluster on the cloud
>
> Hi Praveen,
>
> Check via ssh and ping whether your datanodes are communicating with each
> other.
>
> Cheers,
> Adarsh
>
> [email protected] wrote:
> > Hello all,
> >
> > I installed Hadoop 0.20.2 on physical machines and everything works like
> > a charm. Now I have installed Hadoop using the same hadoop-install gz
> > file on the cloud. The installation seems fine; I can even copy files to
> > HDFS from the master machine. But when I try to do it from another
> > "non-Hadoop" machine, I get the following error. I did some googling, and
> > a lot of people have hit this error, but I could not find any solution.
> >
> > Also, I didn't see any exceptions in the Hadoop logs.
> >
> > Any thoughts?
> >
> > $ /usr/local/hadoop-0.20.2/bin/hadoop fs -copyFromLocal Merchandising-ear.tar.gz /tmp/hadoop-test/Merchandising-ear.tar.gz
> > 11/03/03 21:58:50 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.ConnectException: Connection timed out
> > 11/03/03 21:58:50 INFO hdfs.DFSClient: Abandoning block blk_-8243207628973732008_1005
> > 11/03/03 21:58:50 INFO hdfs.DFSClient: Waiting to find target node: xx.xx.12:50010
> > 11/03/03 21:59:17 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.ConnectException: Connection timed out
> > 11/03/03 21:59:17 INFO hdfs.DFSClient: Abandoning block blk_2852127666568026830_1005
> > 11/03/03 21:59:17 INFO hdfs.DFSClient: Waiting to find target node: xx.xx.16.12:50010
> > 11/03/03 21:59:44 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.ConnectException: Connection timed out
> > 11/03/03 21:59:44 INFO hdfs.DFSClient: Abandoning block blk_2284836193463265901_1005
> > 11/03/03 21:59:44 INFO hdfs.DFSClient: Waiting to find target node: xx.xx.16.12:50010
> > 11/03/03 22:00:11 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.ConnectException: Connection timed out
> > 11/03/03 22:00:11 INFO hdfs.DFSClient: Abandoning block blk_-5600915414055250488_1005
> > 11/03/03 22:00:11 INFO hdfs.DFSClient: Waiting to find target node: xx.xx.16.11:50010
> > 11/03/03 22:00:17 WARN hdfs.DFSClient: DataStreamer Exception: java.io.IOException: Unable to create new block.
> >         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2845)
> >         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
> >         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
> >
> > 11/03/03 22:00:17 WARN hdfs.DFSClient: Error Recovery for block blk_-5600915414055250488_1005 bad datanode[0] nodes == null
> > 11/03/03 22:00:17 WARN hdfs.DFSClient: Could not get block locations. Source file "/tmp/hadoop-test/Merchandising-ear.tar.gz" - Aborting...
> > copyFromLocal: Connection timed out
> > 11/03/03 22:00:17 ERROR hdfs.DFSClient: Exception closing file /tmp/hadoop-test/Merchandising-ear.tar.gz : java.net.ConnectException: Connection timed out
> > java.net.ConnectException: Connection timed out
> >         at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> >         at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
> >         at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
> >         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
> >         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:2870)
> >         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2826)
> >         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
> >         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
> > [C4554954_admin@c4554954vl03 relevancy]$
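Every failure in the log above is a ConnectException against port 50010, the datanode data-transfer port, while namenode metadata operations (-lsr) succeed from the same client. That pattern usually means the external machine can reach the namenode but a firewall or cloud security-group rule blocks the datanodes. A minimal reachability sketch for checking this (the datanode IPs are placeholders, since the thread masks the real ones as xx.xx.16.11 / xx.xx.16.12; it assumes bash for the /dev/tcp redirection and coreutils `timeout`):

```shell
#!/usr/bin/env bash
# Probe a host:port roughly the way DFSClient does when opening a block
# stream: attempt a TCP connect with a short timeout, report OK or FAIL.
check_port() {
  local host="$1" port="$2"
  if timeout 5 bash -c "exec 3<>/dev/tcp/$host/$port" 2>/dev/null; then
    echo "OK   $host:$port"
  else
    echo "FAIL $host:$port"
  fi
}

# Replace these with your actual datanode IPs and rerun from the failing
# client machine:
# check_port 10.0.0.11 50010
# check_port 10.0.0.12 50010

# Demonstration against localhost: port 1 is closed on virtually any host,
# so this prints a FAIL line.
check_port 127.0.0.1 1
```

If any datanode shows FAIL from the client but OK from the master, the problem is network policy between the client and the datanodes rather than the Hadoop configuration itself.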
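On the version question raised at the top of the thread: mixed Hadoop builds across nodes can indeed cause odd client failures, so it is worth confirming every node reports the same build. A hedged sketch of the comparison logic follows; the version strings are hard-coded samples standing in for what gathering `hadoop version | head -1` over ssh from each node would return:

```shell
#!/usr/bin/env bash
# Hypothetical mismatch check. In a real cluster you would populate
# "versions" with something like:
#   for h in $(cat /usr/local/hadoop-0.20.2/conf/slaves); do
#     ssh "$h" '/usr/local/hadoop-0.20.2/bin/hadoop version | head -1'
#   done
versions="Hadoop 0.20.2
Hadoop 0.20.2
Hadoop 0.20.1"

# Count distinct version strings; more than one means a mismatch.
distinct=$(printf '%s\n' "$versions" | sort -u | wc -l | tr -d ' ')
if [ "$distinct" -eq 1 ]; then
  echo "all nodes report the same Hadoop build"
else
  echo "version mismatch detected"
fi
```

With the sample data above, the check reports a mismatch because one node is on 0.20.1.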
