This looks like an issue with either the HBase configs that specify the ZK
quorum being wrong, or ZK itself not responding. If you keep having problems,
I'm sure the hbase-user list would be able to help out pretty quickly. I'd
start by checking that the quorum is properly configured and that you can
connect to it manually from a node.
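As a concrete starting point, here is a minimal sketch of that manual check (the hostnames are placeholders; substitute the members of your hbase.zookeeper.quorum). It uses ZooKeeper's four-letter "ruok" command, which a healthy server answers with "imok":

```python
import socket

def zk_ruok(host, port=2181, timeout=2.0):
    """Send ZooKeeper's four-letter 'ruok' command; a healthy server replies 'imok'."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.sendall(b"ruok")
            s.shutdown(socket.SHUT_WR)  # tell the server we're done sending
            return s.recv(16) == b"imok"
    except OSError:  # connection refused, timed out, or name not resolvable
        return False

# Hypothetical quorum members -- replace with your own hosts:
# for host in ["zk1.example.com", "zk2.example.com", "zk3.example.com"]:
#     print(host, "ok" if zk_ruok(host) else "NOT responding")
```

The same probe from a shell is `echo ruok | nc <host> 2181`; any member that doesn't answer "imok" is the one to look at.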


On Thu, Mar 28, 2013 at 3:25 AM, Praveen Bysani <[email protected]> wrote:

> Hi,
>
> I set up all the nodes using Cloudera Manager, so I assume all the
> classpaths and the environment are handled by the framework (Cloudera
> distro); isn't that so? However, after trying to execute on each node, I
> found that one of my nodes has problems connecting to HBase. The IP
> address of this node was recently changed from what it was during
> installation. I updated the /etc/hosts file on all nodes and restarted all
> Hadoop services. The services tab in Cloudera Manager shows good health for
> all services, which made me believe everything was alright; apparently not so.
>
> Trying to access HBase on that particular node gives:
>
> 13/03/28 16:28:14 ERROR zookeeper.RecoverableZooKeeper: ZooKeeper exists failed after 3 retries
> 13/03/28 16:28:14 WARN zookeeper.ZKUtil: hconnection Unable to set watcher on znode /hbase/master
> org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/master
>         at org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
>         at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
>         at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1041)
>         at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:176)
>         at org.apache.hadoop.hbase.zookeeper.ZKUtil.watchAndCheckExists(ZKUtil.java:418)
>         at org.apache.hadoop.hbase.zookeeper.ZooKeeperNodeTracker.start(ZooKeeperNodeTracker.java:82)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.ensureZookeeperTrackers(HConnectionManager.java:589)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:648)
>         at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:121)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.jruby.javasupport.JavaConstructor.newInstanceDirect(JavaConstructor.java:275)
>         at org.jruby.java.invokers.ConstructorInvoker.call(ConstructorInvoker.java:91)
>         at org.jruby.java.invokers.ConstructorInvoker.call(ConstructorInvoker.java:178)
>         at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
>         at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)
>         at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:182)
>         at org.jruby.java.proxies.ConcreteJavaProxy$2.call(ConcreteJavaProxy.java:47)
>         at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
>         at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)
>
> I understand this is no longer a Pig issue; it would be great if
> someone could give some pointers on configuring HBase on the node that
> has a new IP address.
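One set of pointers for the question above (not authoritative, and the hostname/IP below are hypothetical): after re-addressing a node, confirm on every host, including the node itself, that the node's name now resolves to the new IP, since a stale /etc/hosts or cached DNS entry produces exactly this kind of ZooKeeper ConnectionLoss. A minimal check:

```python
import socket

def resolves_to(hostname, expected_ip):
    """Check that `hostname` resolves (via /etc/hosts or DNS) to the IP we expect.

    Both arguments are placeholders; run this on every node with the
    re-addressed node's name and its new address.
    """
    try:
        return socket.gethostbyname(hostname) == expected_ip
    except socket.gaierror:  # name does not resolve at all
        return False

# e.g. resolves_to("hbase-node.example.com", "10.0.0.42") on each cluster host
```

If resolution is wrong anywhere, fix /etc/hosts (or DNS) on that host and restart the HBase and ZooKeeper services on the affected node; note that the ZK quorum hosts named in hbase-site.xml must also resolve correctly from that node.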
>
> On 28 March 2013 12:54, Bill Graham <[email protected]> wrote:
>
>> Your initial exception shows ClassNotFoundExceptions for HBase. Are you
>> adding HBase to PIG_CLASSPATH on the client, or do you have it installed on
>> your Hadoop nodes? If the latter, maybe some nodes are configured
>> differently from others?
>>
>>
>> On Wed, Mar 27, 2013 at 9:20 PM, Praveen Bysani <[email protected]> wrote:
>>
>> > This is not about casting types. The scripts sometimes work without any
>> > issue and sometimes fail with the error I specified before. I have no clue
>> > what the issue might be; the network, probably? I run my cluster on VPS
>> > machines, running CDH 4.2 installed using Cloudera Manager. I am
>> > running Pig version 0.10.1, which is installed as a parcel.
>> >
>> > On 27 March 2013 16:29, Praveen Bysani <[email protected]> wrote:
>> >
>> > > Hi,
>> > >
>> > > I am unable to typecast fields loaded from my HBase table to anything
>> > > other than the default bytearray. I tried both during the LOAD statement
>> > > and using a typecast after loading; neither works. The script works when
>> > > I load the data as below:
>> > >
>> > > records = LOAD 'hbase://hantu' USING
>> > > org.apache.pig.backend.hadoop.hbase.HBaseStorage('v:member v:guest') as
>> > > (member, guest);
>> > > records_limit = LIMIT records 10;
>> > > DUMP records_limit;
>> > >
>> > > But when I change the first line to:
>> > >
>> > > records = LOAD 'hbase://hantu' USING
>> > > org.apache.pig.backend.hadoop.hbase.HBaseStorage('v:member v:guest') as
>> > > (member:chararray, guest:chararray);
>> > >
>> > > the Pig script fails and the log is as below:
>> > > Backend error message
>> > > ---------------------
>> > > Error: java.lang.ClassNotFoundException:
>> > > org.apache.hadoop.hbase.mapreduce.TableInputFormat
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>> > >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>> > >         at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
>> > >         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> > >         at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>> > >         at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >
>> > > Backend error message
>> > > ---------------------
>> > > Error: java.lang.ClassNotFoundException:
>> > > org.apache.hadoop.hbase.mapreduce.TableInputFormat
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>> > >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>> > >         at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>> > >         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> > >         at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>> > >         at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >
>> > > Error message from task (reduce) task_201303270642_0043_r_000000
>> > > ----------------------------------------------------------------
>> > > ERROR 6015: During execution, encountered a Hadoop error.
>> > >
>> > > org.apache.pig.backend.executionengine.ExecException: ERROR 6015: During
>> > > execution, encountered a Hadoop error.
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>> > >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>> > >         at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
>> > >         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> > >         at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>> > >         at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > > Caused by: java.lang.ClassNotFoundException:
>> > > org.apache.hadoop.hbase.mapreduce.TableInputFormat
>> > >         ... 14 more
>> > >
>> > > ================================================================================
>> > > Error message from task (reduce) task_201303270642_0043_r_000000
>> > > ----------------------------------------------------------------
>> > > ERROR 6015: During execution, encountered a Hadoop error.
>> > >
>> > > org.apache.pig.backend.executionengine.ExecException: ERROR 6015: During
>> > > execution, encountered a Hadoop error.
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>> > >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>> > >         at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>> > >         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> > >         at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>> > >         at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > > Caused by: java.lang.ClassNotFoundException:
>> > > org.apache.hadoop.hbase.mapreduce.TableInputFormat
>> > >         ... 14 more
>> > >
>> > > ================================================================================
>> > > Pig Stack Trace
>> > > ---------------
>> > > ERROR 6015: During execution, encountered a Hadoop error.
>> > >
>> > > org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to
>> > > open iterator for alias bet_records_float. Backend error : During
>> > > execution, encountered a Hadoop error.
>> > >         at org.apache.pig.PigServer.openIterator(PigServer.java:826)
>> > >         at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:696)
>> > >         at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:320)
>> > >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
>> > >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
>> > >         at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
>> > >         at org.apache.pig.Main.run(Main.java:604)
>> > >         at org.apache.pig.Main.main(Main.java:157)
>> > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> > >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > >         at java.lang.reflect.Method.invoke(Method.java:601)
>> > >         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>> > > Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR
>> > > 6015: During execution, encountered a Hadoop error.
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>> > >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>> > >         at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
>> > >         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> > >         at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>> > >         at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>> > >
>> > > The data consists of valid, simple strings; I am not sure what the problem is.
>> > > --
>> > > Regards,
>> > > Praveen Bysani
>> > > http://www.praveenbysani.com
>> > >
>> >
>> >
>> >
>> > --
>> > Regards,
>> > Praveen Bysani
>> > http://www.praveenbysani.com
>> >
>>
>>
>>
>> --
>> *Note that I'm no longer using my Yahoo! email address. Please email me at
>> [email protected] going forward.*
>>
>
>
>
> --
> Regards,
> Praveen Bysani
> http://www.praveenbysani.com
>



