Hello,

During the parse step of a fetch of 600,000 pages on a 5-node cluster, the job failed on 2 nodes with this error message:
org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.dfs.LeaseExpiredException: No lease on /user/nutch/crawl/segments/20070127060350/crawl_parse/part-00001
        at org.apache.hadoop.dfs.FSNamesystem.getAdditionalBlock(FSNamesystem.java:448)
        at org.apache.hadoop.dfs.NameNode.addBlock(NameNode.java:184)
        at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:243)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:469)
        at org.apache.hadoop.ipc.Client$Connection.run(Client.java:159)

Has anyone run into this problem before, and can you suggest a solution?

--
View this message in context: http://www.nabble.com/Lease-expired-exception-tf3130730.html#a8674514
Sent from the Nutch - User mailing list archive at Nabble.com.
