Hi,

Thanks for your quick responses.
I tried to relax this limit to 204800, but it still does not work.
Could this be caused by fs objects?
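
One thing I should probably check: I raised the limit in my own shell, but
maybe it never applied to the user the datanode runs as, or the daemon was
not restarted afterwards. A quick way to verify (assuming the datanode runs
as user "hadoop" -- the name is only an example -- and a kernel recent
enough to have /proc/<pid>/limits) would be something like:

  # su - hadoop -c 'ulimit -n'
  # cat /proc/$(pgrep -f DataNode)/limits | grep 'open files'
  # ls /proc/$(pgrep -f DataNode)/fd | wc -l

(If pgrep returns more than one pid, pick the datanode's pid by hand.) The
last command counts the file descriptors the datanode currently holds, which
should show whether it is really running close to the limit.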

Anyway, thanks a lot!



2009/6/22 zhuweimin <xim-...@tsm.kddilabs.jp>

> Hi
>
> The maximum number of open files is limited on a Linux box. Please use
> ulimit to view and modify the limit:
> 1. View the limit:
>   # ulimit -a
> 2. Modify the limit, for example:
>   # ulimit -n 10240
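>
> Note: ulimit -n only changes the limit for the current shell session. To
> make it permanent (assuming your system uses PAM limits), you can add
> lines like the following to /etc/security/limits.conf for the user that
> runs the Hadoop daemons, then log in again and restart the daemons:
>
>   hadoop  soft  nofile  10240
>   hadoop  hard  nofile  10240
>
> (The user name "hadoop" above is only an example; use whatever account
> your datanode runs under.)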
>
> Best wishes
>
> > -----Original Message-----
> > From: stchu [mailto:stchu.cl...@gmail.com]
> > Sent: Monday, June 22, 2009 12:57 PM
> > To: core-user@hadoop.apache.org
> > Subject: problem about put a lot of files
> >
> > Hi,
> > Is there any restriction on the number of files that can be put? I tried
> > to put/copyFromLocal about 50,573 files to HDFS, but I ran into a problem:
> > ======================================================================
> > 09/06/22 11:34:34 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 140.96.89.57:51010
> > 09/06/22 11:34:34 INFO dfs.DFSClient: Abandoning block blk_8245450203753506945_65955
> > 09/06/22 11:34:40 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 140.96.89.57:51010
> > 09/06/22 11:34:40 INFO dfs.DFSClient: Abandoning block blk_-8257846965500649510_65956
> > 09/06/22 11:34:46 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 140.96.89.57:51010
> > 09/06/22 11:34:46 INFO dfs.DFSClient: Abandoning block blk_4751737303082929912_65956
> > 09/06/22 11:34:56 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 140.96.89.57:51010
> > 09/06/22 11:34:56 INFO dfs.DFSClient: Abandoning block blk_5912850890372596972_66040
> > 09/06/22 11:35:02 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 140.96.89.193:51010
> > 09/06/22 11:35:02 INFO dfs.DFSClient: Abandoning block blk_6609198685444611538_66040
> > 09/06/22 11:35:08 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 140.96.89.193:51010
> > 09/06/22 11:35:08 INFO dfs.DFSClient: Abandoning block blk_6696101244177965180_66040
> > 09/06/22 11:35:17 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 140.96.89.57:51010
> > 09/06/22 11:35:17 INFO dfs.DFSClient: Abandoning block blk_-5430033105510098342_66105
> > 09/06/22 11:35:26 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 140.96.89.57:51010
> > 09/06/22 11:35:26 INFO dfs.DFSClient: Abandoning block blk_5325140471333041601_66165
> > 09/06/22 11:35:32 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 140.96.89.205:51010
> > 09/06/22 11:35:32 INFO dfs.DFSClient: Abandoning block blk_1121864992752821949_66165
> > 09/06/22 11:35:39 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 140.96.89.205:51010
> > 09/06/22 11:35:39 INFO dfs.DFSClient: Abandoning block blk_-2096783021040778965_66184
> > 09/06/22 11:35:45 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 140.96.89.205:51010
> > 09/06/22 11:35:45 INFO dfs.DFSClient: Abandoning block blk_6949821898790162970_66184
> > 09/06/22 11:35:51 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 140.96.89.205:51010
> > 09/06/22 11:35:51 INFO dfs.DFSClient: Abandoning block blk_4708848202696905125_66184
> > 09/06/22 11:35:57 INFO dfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 140.96.89.205:51010
> > 09/06/22 11:35:57 INFO dfs.DFSClient: Abandoning block blk_8031882012801762201_66184
> > 09/06/22 11:36:03 WARN dfs.DFSClient: DataStreamer Exception: java.io.IOException: Unable to create new block.
> >     at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2359)
> >     at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.access$1800(DFSClient.java:1745)
> >     at org.apache.hadoop.dfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:1922)
> >
> > 09/06/22 11:36:03 WARN dfs.DFSClient: Error Recovery for block blk_8031882012801762201_66184 bad datanode[2]
> > put: Could not get block locations. Aborting...
> > Exception closing file /osmFiles/a/109103.gpx.txt
> > java.io.IOException: Could not get block locations. Aborting...
> >     at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.processDatanodeError(DFSClient.java:2153)
> >     at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.access$1400(DFSClient.java:1745)
> >     at org.apache.hadoop.dfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:1899)
> >
> > ======================================================================
> >
> > And I checked the log file on one of the datanodes:
> >
> > ======================================================================
> > 2009-06-22 11:34:47,888 INFO org.apache.hadoop.dfs.DataNode: PacketResponder 2 for block blk_1759242372147720864_66183 terminating
> > 2009-06-22 11:34:47,926 INFO org.apache.hadoop.dfs.DataNode: Receiving block blk_-2096783021040778965_66184 src: /140.96.89.224:53984 dest: /140.96.89.224:51010
> > 2009-06-22 11:34:47,926 INFO org.apache.hadoop.dfs.DataNode: writeBlock blk_-2096783021040778965_66184 received exception java.io.IOException: Too many open files
> > 2009-06-22 11:34:47,926 ERROR org.apache.hadoop.dfs.DataNode: DatanodeRegistration(140.96.89.193:51010, storageID=DS-1452520190-140.96.89.193-51010-1241603100681, infoPort=51075, ipcPort=51020):DataXceiver: java.io.IOException: Too many open files
> >         at sun.nio.ch.IOUtil.initPipe(Native Method)
> >         at sun.nio.ch.EPollSelectorImpl.<init>(EPollSelectorImpl.java:49)
> >         at sun.nio.ch.EPollSelectorProvider.openSelector(EPollSelectorProvider.java:18)
> >         at sun.nio.ch.Util.getTemporarySelector(Util.java:123)
> >         at sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:92)
> >         at org.apache.hadoop.dfs.DataNode$DataXceiver.writeBlock(DataNode.java:1254)
> >         at org.apache.hadoop.dfs.DataNode$DataXceiver.run(DataNode.java:1091)
> >         at java.lang.Thread.run(Thread.java:619)
> > ======================================================================
> >
> > Does anyone know why this occurred and how to solve the problem? Thank
> > you very much!!
