We updated Hadoop from the trunk branch, but now we get new errors:

On the TaskTracker side:
<skipped>
java.io.IOException: timed out waiting for response
        at org.apache.hadoop.ipc.Client.call(Client.java:305)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:149)
        at org.apache.hadoop.mapred.$Proxy0.pollForTaskWithClosedJob(Unknown Source)
        at org.apache.hadoop.mapred.TaskTracker.offerService(TaskTracker.java:310)
        at org.apache.hadoop.mapred.TaskTracker.run(TaskTracker.java:374)
        at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:813)
060427 062708 Client connection to 10.0.0.10:9001 caught: java.lang.RuntimeException: java.lang.ClassNotFoundException:
java.lang.RuntimeException: java.lang.ClassNotFoundException:
        at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:152)
        at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:139)
        at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:186)
        at org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:60)
        at org.apache.hadoop.ipc.Client$Connection.run(Client.java:170)
060427 062708 Client connection to 10.0.0.10:9001: closing
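
The empty class name in that ClassNotFoundException makes me suspect the two daemons no longer agree on the RPC wire format, so the client reads the class-name string from the wrong position in the stream. A minimal sketch of that failure mode (hypothetical, not the actual ObjectWritable code):

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// Hypothetical sketch, not the Hadoop source: a reader that expects a UTF
// class name at the current stream position. If the peers were built from
// different revisions, the newer one may write an extra field first, and
// the "class name" the older one reads is empty or garbage.
public class MisalignedReadSketch {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeInt(42);                  // newer peer: an extra field...
        out.writeUTF("java.lang.String");  // ...then the class name
        out.flush();

        DataInputStream in =
            new DataInputStream(new ByteArrayInputStream(buf.toByteArray()));
        String declaredName = in.readUTF(); // older peer reads the name first
        try {                               // and lands on the wrong bytes
            Class.forName(declaredName);    // declaredName is "" here
        } catch (ClassNotFoundException e) {
            // Prints "java.lang.RuntimeException: java.lang.ClassNotFoundException: "
            // with an empty class name, much like the trace above.
            System.out.println(new RuntimeException(e));
        }
    }
}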


On the JobTracker side:
<skipped>
060427 061713 Server handler 3 on 9001 caught: java.lang.IllegalArgumentException: Argument is not an array
java.lang.IllegalArgumentException: Argument is not an array
        at java.lang.reflect.Array.getLength(Native Method)
        at org.apache.hadoop.io.ObjectWritable.writeObject(ObjectWritable.java:92)
        at org.apache.hadoop.io.ObjectWritable.write(ObjectWritable.java:64)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:250)
<skipped>
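
Array.getLength() only throws "Argument is not an array" when it is handed a non-array object, which again points at the JobTracker and TaskTracker disagreeing on a method signature after the update. A minimal sketch of that mismatch (hypothetical, not the actual ObjectWritable.writeObject):

import java.lang.reflect.Array;

// Hypothetical sketch, not the Hadoop source: a writer that dispatches on a
// *declared* type taken from RPC metadata. If the declared type says "array"
// but the value is not one (say, a method's return type changed between the
// revisions the two daemons were built from), Array.getLength() throws the
// exact exception in the trace above.
public class DeclaredTypeWriteSketch {
    static void writeObject(Object value, Class<?> declaredClass) {
        if (declaredClass.isArray()) {
            int length = Array.getLength(value); // throws if value is no array
            for (int i = 0; i < length; i++) {
                System.out.println(Array.get(value, i));
            }
        } else {
            System.out.println(value);
        }
    }

    public static void main(String[] args) {
        writeObject(new int[] {1, 2, 3}, int[].class); // consistent: fine
        writeObject(Integer.valueOf(42), int[].class); // mismatched: throws
        // java.lang.IllegalArgumentException: Argument is not an array
    }
}

If that is the cause, rebuilding and deploying both daemons from the same revision should clear both traces.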

-----Original Message-----
From: Doug Cutting [mailto:[EMAIL PROTECTED]]
Sent: Thursday, April 27, 2006 12:48 AM
To: [email protected]
Subject: Re: exception
Importance: High

This is a Hadoop DFS error.  It could mean that you don't have any 
datanodes running, or that all your datanodes are full.  Or, it could be 
a bug in dfs.  You might try a recent nightly build of Hadoop to see if 
it works any better.

Doug
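
A minimal sketch of the block-allocation step behind that error (hypothetical, not the actual Hadoop namenode code); note that "no datanodes running" and "all datanodes full" end in the same exception:

import java.io.IOException;
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch, not the Hadoop source: the namenode-side choice of a
// datanode for the next block. An empty node list (no datanodes running) and
// a list with no remaining capacity (all datanodes full) fail identically.
public class BlockAllocatorSketch {
    static final long BLOCK_SIZE = 32 * 1024 * 1024; // assumed block size

    static class DataNodeInfo {
        final String host;
        final long remainingBytes;
        DataNodeInfo(String host, long remainingBytes) {
            this.host = host;
            this.remainingBytes = remainingBytes;
        }
    }

    static DataNodeInfo chooseTarget(List<DataNodeInfo> liveNodes, String path)
            throws IOException {
        for (DataNodeInfo node : liveNodes) {
            if (node.remainingBytes >= BLOCK_SIZE) {
                return node;
            }
        }
        throw new IOException("Cannot obtain additional block for file " + path);
    }

    public static void main(String[] args) throws IOException {
        List<DataNodeInfo> nodes = Arrays.asList(
            new DataNodeInfo("datanode1", 0L),   // full
            new DataNodeInfo("datanode2", 0L));  // full
        // Throws: java.io.IOException: Cannot obtain additional block for
        // file /user/root/crawl/indexes/index/_0.prx
        chooseTarget(nodes, "/user/root/crawl/indexes/index/_0.prx");
    }
}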

Anton Potehin wrote:
> What does an error of the following type mean:
>
> java.rmi.RemoteException: java.io.IOException: Cannot obtain additional block for file /user/root/crawl/indexes/index/_0.prx