Hi,

We are using Hypertable 0.9.2.3 with Hadoop 0.20.0. Hadoop installs and
starts fine, and Hypertable comes up on top of it. However, when we try
to create a table, Hypertable hangs, even though the table name then
appears in the output of "show tables". If we try to drop that table,
Hypertable hangs again.
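
For reference, this is roughly what we do in the Hypertable shell (the
table name and column family are just an example, and the binary path
is from our default install, so adjust for your layout):

    $ /opt/hypertable/0.9.2.3/bin/hypertable

    hypertable> CREATE TABLE TestTable ( info );   <-- hangs here
    hypertable> SHOW TABLES;                       <-- TestTable is listed anyway
    hypertable> DROP TABLE TestTable;              <-- hangs again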

If we reformat the namenode and set Hadoop up again, the JobTracker
does not come up (it reports that its port is already in use), and the
namenode shows 0 live datanodes.
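
In case it matters, these are roughly the steps we follow to reformat
and restart, plus the port check we do afterwards (single-node setup,
run from $HADOOP_HOME; paths may differ in your layout):

    $ bin/stop-all.sh
    $ bin/hadoop namenode -format        <-- wipes the HDFS metadata
    $ bin/start-all.sh                   <-- namenode, datanode, jobtracker, tasktracker
    $ bin/hadoop dfsadmin -report        <-- still reports 0 live datanodes
    $ netstat -nlp | grep 9001           <-- something is still bound to the JobTracker port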

Has anyone come across this issue before?

Please help.

Thanks

Here is the log from hadoop-ems-jobtracker-localhost.localdomain:
=================================================================================
2009-11-24 18:29:58,556 WARN org.apache.hadoop.hdfs.DFSClient:
NotReplicatedYetException sleeping /tmp/hadoop-root/mapred/system/
jobtracker.info retries left 1
2009-11-24 18:30:01,761 WARN org.apache.hadoop.hdfs.DFSClient:
DataStreamer Exception: org.apache.hadoop.ipc.RemoteException:
java.io.IOException: File /tmp/hadoop-root/mapred/system/
jobtracker.info could only be replicated to 0 nodes, instead of 1
        at
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock
(FSNamesystem.java:1256)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock
(NameNode.java:422)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke
(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke
(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

        at org.apache.hadoop.ipc.Client.call(Client.java:739)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
        at $Proxy4.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke
(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke
(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod
(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke
(RetryInvocationHandler.java:59)
        at $Proxy4.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient
$DFSOutputStream.locateFollowingBlock(DFSClient.java:2873)
        at org.apache.hadoop.hdfs.DFSClient
$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2755)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000
(DFSClient.java:2046)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run
(DFSClient.java:2232)

2009-11-24 18:30:01,761 WARN org.apache.hadoop.hdfs.DFSClient: Error
Recovery for block null bad datanode[0] nodes == null
2009-11-24 18:30:01,761 WARN org.apache.hadoop.hdfs.DFSClient: Could
not get block locations. Source file "/tmp/hadoop-root/mapred/system/
jobtracker.info" - Aborting...
2009-11-24 18:30:01,762 WARN org.apache.hadoop.mapred.JobTracker:
Failed to initialize recovery manager. The Recovery manager failed to
access the system files in the system dir (hdfs://localhost:9000/tmp/
hadoop-root/mapred/system).
2009-11-24 18:30:01,765 WARN org.apache.hadoop.mapred.JobTracker: It
might be because the JobTracker failed to read/write system files
(hdfs://localhost:9000/tmp/hadoop-root/mapred/system/jobtracker.info /
hdfs://localhost:9000/tmp/hadoop-root/mapred/system/jobtracker.info.recover)
or the system  file 
hdfs://localhost:9000/tmp/hadoop-root/mapred/system/jobtracker.info
is missing!
2009-11-24 18:30:01,766 WARN org.apache.hadoop.mapred.JobTracker:
Bailing out...
2009-11-24 18:30:01,766 WARN org.apache.hadoop.mapred.JobTracker:
Error starting tracker: org.apache.hadoop.ipc.RemoteException:
java.io.IOException: File /tmp/hadoop-root/mapred/system/
jobtracker.info could only be replicated to 0 nodes, instead of 1
        at
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock
(FSNamesystem.java:1256)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock
(NameNode.java:422)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke
(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke
(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

        at org.apache.hadoop.ipc.Client.call(Client.java:739)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
        at $Proxy4.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke
(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke
(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod
(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke
(RetryInvocationHandler.java:59)
        at $Proxy4.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient
$DFSOutputStream.locateFollowingBlock(DFSClient.java:2873)
        at org.apache.hadoop.hdfs.DFSClient
$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2755)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000
(DFSClient.java:2046)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run
(DFSClient.java:2232)

2009-11-24 18:30:02,768 FATAL org.apache.hadoop.mapred.JobTracker:
java.net.BindException: Problem binding to localhost/127.0.0.1:9001 :
Address already in use
        at org.apache.hadoop.ipc.Server.bind(Server.java:190)
        at org.apache.hadoop.ipc.Server$Listener.<init>(Server.java:253)
        at org.apache.hadoop.ipc.Server.<init>(Server.java:1026)
        at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:488)
        at org.apache.hadoop.ipc.RPC.getServer(RPC.java:450)
        at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1537)
        at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:
174)
        at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:3528)
Caused by: java.net.BindException: Address already in use
        at sun.nio.ch.Net.bind(Native Method)
        at sun.nio.ch.ServerSocketChannelImpl.bind
(ServerSocketChannelImpl.java:137)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:77)
        at org.apache.hadoop.ipc.Server.bind(Server.java:188)
        ... 7 more

2009-11-24 18:30:02,769 INFO org.apache.hadoop.mapred.JobTracker:
SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down JobTracker at localhost.localdomain/
127.0.0.1
************************************************************/

=================================================================================
