Hi all,

I have been trying to run a MapReduce job that uses HBase as both source and 
sink. I have HBase 0.94.2 and Hadoop 2.0 installed from the Cloudera 
repository, following their instructions.

When I use the HBase client package version 0.94.2 or above, the job fails 
with the DNS-related error below. When I use HBase client package 0.92.1 
against the HBase version I have installed (0.94.2), everything works fine. 
However, I want to use the newer HBase client package, and I hope someone can 
tell me what is wrong. I am also raising this in case it turns out to be a bug.

I have disabled IPv6 and am not using it at all, so I am not sure why 
DnsClient fails to parse this string.
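For what it's worth, the failing string looks like the tail of an IPv6 address. The stack trace dies in com.sun.jndi.dns.DnsClient, which Hadoop's DNS.reverseDns uses for the reverse lookup, and that code splits each nameserver spec at the first ':' and parses the remainder as a port number. A minimal sketch of the parse that seems to be failing (the full nameserver address here is hypothetical; only the part after the first colon matches the exception message):

```java
public class Ipv6PortParseSketch {
    public static void main(String[] args) {
        // Hypothetical IPv6 nameserver (e.g. from /etc/resolv.conf); only the
        // tail after the first ':' matches the exception message below.
        String nameserver = "2a01:4f8:0:a102::add:9999";
        // Splitting host:port at the first ':' leaves the rest of the IPv6
        // address as the "port", which Integer.parseInt cannot handle.
        String portPart = nameserver.substring(nameserver.indexOf(':') + 1);
        try {
            Integer.parseInt(portPart);
        } catch (NumberFormatException e) {
            System.out.println(e.getMessage()); // For input string: "4f8:0:a102::add:9999"
        }
    }
}
```

If that is what is happening, the resolver is still handing out an IPv6 nameserver address even though IPv6 is disabled on the interfaces.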


13/03/26 05:00:51 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/usr/lib/hadoop/libexec/../lib/native/Linux-amd64-64
13/03/26 05:00:51 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
13/03/26 05:00:51 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
13/03/26 05:00:51 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
13/03/26 05:00:51 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
13/03/26 05:00:51 INFO zookeeper.ZooKeeper: Client environment:os.version=2.6.32-279.22.1.el6.x86_64
13/03/26 05:00:51 INFO zookeeper.ZooKeeper: Client environment:user.name=hbase
13/03/26 05:00:51 INFO zookeeper.ZooKeeper: Client environment:user.home=/var/run/hbase
13/03/26 05:00:51 INFO zookeeper.ZooKeeper: Client environment:user.dir=/tmp
13/03/26 05:00:51 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=<MY DOMAIN NAME>:2181 sessionTimeout=180000 watcher=hconnection
13/03/26 05:00:51 INFO zookeeper.ClientCnxn: Opening socket connection to server /46.4.115.71:2181
13/03/26 05:00:51 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 6941@<MY DOMAIN NAME>
13/03/26 05:00:51 WARN client.ZooKeeperSaslClient: SecurityException: java.lang.SecurityException: Unable to locate a login configuration occurred when trying to find JAAS configuration.
13/03/26 05:00:51 INFO client.ZooKeeperSaslClient: Client will not SASL-authenticate because the default JAAS configuration section 'Client' could not be found. If you are not using SASL, you may ignore this. On the other hand, if you expected SASL to work, please fix your JAAS configuration.
13/03/26 05:00:51 INFO zookeeper.ClientCnxn: Socket connection established to <MY DOMAIN NAME>/46.4.115.71:2181, initiating session
13/03/26 05:00:51 INFO zookeeper.ClientCnxn: Session establishment complete on server <MY DOMAIN NAME>/46.4.115.71:2181, sessionid = 0x13da4a9a3ea0016, negotiated timeout = 40000
13/03/26 05:00:51 INFO mapreduce.TableOutputFormat: Created table instance for summary_user
13/03/26 05:00:51 INFO mapred.JobClient: Cleaning up the staging area hdfs://<MY DOMAIN NAME>:8020/user/hbase/.staging/job_201303260407_0003

Exception in thread "main" java.lang.NumberFormatException: For input string: "4f8:0:a102::add:9999"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
        at java.lang.Integer.parseInt(Integer.java:458)
        at java.lang.Integer.parseInt(Integer.java:499)
        at com.sun.jndi.dns.DnsClient.<init>(DnsClient.java:103)
        at com.sun.jndi.dns.Resolver.<init>(Resolver.java:44)
        at com.sun.jndi.dns.DnsContext.getResolver(DnsContext.java:553)
        at com.sun.jndi.dns.DnsContext.c_getAttributes(DnsContext.java:413)
        at com.sun.jndi.toolkit.ctx.ComponentDirContext.p_getAttributes(ComponentDirContext.java:213)
        at com.sun.jndi.toolkit.ctx.PartialCompositeDirContext.getAttributes(PartialCompositeDirContext.java:121)
        at com.sun.jndi.toolkit.url.GenericURLDirContext.getAttributes(GenericURLDirContext.java:85)
        at javax.naming.directory.InitialDirContext.getAttributes(InitialDirContext.java:123)
        at org.apache.hadoop.net.DNS.reverseDns(DNS.java:85)
        at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.reverseDNS(TableInputFormatBase.java:219)
        at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:184)
        at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1064)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1081)
        at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:993)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:946)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:946)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
        at hbase_mapred1.FreqCounter1.main(FreqCounter1.java:86)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
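As a sanity check, the reverse lookup can be tried through the plain java.net API, which goes through the system resolver rather than the JNDI "dns://" provider that the stack trace passes through. This is only a diagnostic sketch, not a fix; the IP is the region server address from the log above:

```java
import java.net.InetAddress;

public class ReverseLookupCheck {
    public static void main(String[] args) throws Exception {
        // Resolve the PTR record via the system resolver instead of the
        // JNDI DNS provider used by org.apache.hadoop.net.DNS.reverseDns.
        InetAddress addr = InetAddress.getByName("46.4.115.71");
        System.out.println(addr.getCanonicalHostName());
    }
}
```

If this prints the expected region server hostname, reverse DNS itself is fine and the failure is confined to how the JNDI provider parses the configured nameserver.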


Thank you very much.

Regards,
Heng
