Adding back user@. Can you look at the example in this post and compose startRow accordingly?
http://stackoverflow.com/questions/18040012/what-is-the-equivalent-of-javas-bytebuffer-wrap-in-c

On Tue, Mar 11, 2014 at 7:19 PM, Umesh Chaudhary <[email protected]> wrote:

> Hi Ted,
>
> By giving null to attributes, the scanner is working now; thanks for the idea.
>
> But when I give Guid.Empty.ToByteArray() as the ByteBuffer startRow
> parameter's value, I get no rows in scannerGet_result.
>
> Please let me know what value I should pass for the start row parameter.
>
> *From:* Ted Yu [mailto:[email protected]]
> *Sent:* Tuesday, March 11, 2014 10:15 PM
> *To:* Umesh Chaudhary
> *Subject:* Re: Cannot find row in .META. for table
>
> Have you seen this?
>
> http://stackoverflow.com/questions/10078348/byte-collection-based-similar-with-bytebuffer-from-java
>
> Looking at src/main/java/org/apache/hadoop/hbase/thrift/ThriftServerRunner.java :
>
>   private static void addAttributes(OperationWithAttributes op,
>       Map<ByteBuffer, ByteBuffer> attributes) {
>     if (attributes == null || attributes.size() == 0) {
>       return;
>
> You can pass the C# equivalent of null for attributes.
>
> Cheers
>
> On Tue, Mar 11, 2014 at 8:19 AM, Umesh Chaudhary <[email protected]> wrote:
>
> Thanks for the reply, Ted. I am using the Hbase-sharp dll, which was ported
> with a kind-of-old scannerOpen() method that has no
> Map<ByteBuffer,ByteBuffer> attributes parameter.
> I have also generated C# code from the new Thrift server, in which I get
> the 4 arguments you listed.
> Now my concern is how I should supply the {ByteBuffer startRow} and
> {Map<ByteBuffer,ByteBuffer> attributes} parameters, because I want to get
> all rows from the specified table.
>
> -----Original Message-----
> From: Ted Yu [mailto:[email protected]]
> Sent: Tuesday, March 11, 2014 8:21 PM
> To: [email protected]
> Subject: Re: Cannot find row in .META. for table
>
> In src/main/java/org/apache/hadoop/hbase/thrift/generated/Hbase.java, I
> found the following scannerOpen() methods:
>
> public int scannerOpen(ByteBuffer tableName, ByteBuffer startRow,
>     List<ByteBuffer> columns, Map<ByteBuffer,ByteBuffer> attributes)
>     throws IOError, org.apache.thrift.TException;
> public void scannerOpen(ByteBuffer tableName, ByteBuffer startRow,
>     List<ByteBuffer> columns, Map<ByteBuffer,ByteBuffer> attributes,
>     org.apache.thrift.async.AsyncMethodCallback<AsyncClient.scannerOpen_call> resultHandler)
>     throws org.apache.thrift.TException;
> public int scannerOpen(ByteBuffer tableName, ByteBuffer startRow,
>     List<ByteBuffer> columns, Map<ByteBuffer,ByteBuffer> attributes)
>     throws IOError, org.apache.thrift.TException
> public void scannerOpen(ByteBuffer tableName, ByteBuffer startRow,
>     List<ByteBuffer> columns, Map<ByteBuffer,ByteBuffer> attributes,
>     org.apache.thrift.async.AsyncMethodCallback<scannerOpen_call> resultHandler)
>     throws org.apache.thrift.TException {
>
> None of the above takes 3 parameters.
>
> On Tue, Mar 11, 2014 at 6:05 AM, Umesh Chaudhary <[email protected]> wrote:
>
> > I am getting the below message while running hbck with/without parameters:
> >
> > Number of regions: 7
> > Deployed on: jci0.jci.com,60020,1394472660266
> >   jci1.jci.com,60020,1394472671945
> >   jci2.jci.com,60020,1394472679477
> >   jci3.jci.com,60020,1394472703951
> > 0 inconsistencies detected.
> >
> > If there are 0 inconsistencies, then why am I facing this issue?
> > Please check my code:
> >
> > var rows = _hbase.getRow(table_name, BitConverter.GetBytes("Asset"));
> > ---> where "Asset" is my column family.
> >
> > OR
> >
> > var scanner = _hbase.scannerOpen(table_name, BitConverter.GetBytes(1), columnsListinByteArray);
> >
> > Because I am a newbie to the Thrift API for C#, please suggest how I can
> > provide arguments for the same.
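Pulling the advice in this thread together, here is a minimal JDK-only sketch (no HBase or Thrift classes required) of the byte-level arguments being discussed: an empty startRow to begin the scan at the first row, the table name wrapped as raw UTF-8 bytes, and a null attributes map, which the server-side addAttributes() shown above accepts and ignores. The table name is taken from the error later in the thread.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class ScanArgsSketch {
    public static void main(String[] args) {
        // An empty start row conventionally means "scan from the first
        // row of the table".
        ByteBuffer startRow = ByteBuffer.wrap(new byte[0]);

        // Table name as raw UTF-8 bytes, wrapped without copying.
        ByteBuffer tableName =
            ByteBuffer.wrap("tblAssetsView".getBytes(StandardCharsets.UTF_8));

        System.out.println(startRow.remaining());   // 0
        System.out.println(tableName.remaining());  // 13
    }
}
```

In C#, the analogous start row would be an empty byte array (new byte[0]); note that Guid.Empty.ToByteArray() instead produces 16 zero-valued bytes, which is not the same thing as a zero-length row key.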
> > -----Original Message-----
> > From: Jean-Marc Spaggiari [mailto:[email protected]]
> > Sent: Tuesday, March 11, 2014 5:13 PM
> > To: user
> > Subject: Re: Cannot find row in .META. for table
> >
> > Before using -repair or any other parameter, I would recommend you run it
> > without any parameters first, to get a sense of what hbck will find.
> >
> > JM
> >
> > 2014-03-11 7:36 GMT-04:00 divye sheth <[email protected]>:
> >
> > > You can use the hbck utility to repair these kinds of problems.
> > >
> > > $ hbase hbck -repair
> > > OR
> > > $ hbase hbck -fixMeta
> > >
> > > Thanks
> > > Divye Sheth
> > >
> > > On Tue, Mar 11, 2014 at 4:55 PM, Umesh Chaudhary <[email protected]> wrote:
> > >
> > > > Hi,
> > > > I am using Hbase 0.94.1 with Hadoop 1.2.1, and I use the Thrift API to
> > > > access tables stored in Hbase from my C# application. I am able to
> > > > connect to the server, but any operation from the client gives the
> > > > following error in the CLI log:
> > > >
> > > > 14/03/11 12:18:53 WARN client.HConnectionManager$HConnectionImplementation: Encountered problems when prefetch META table:
> > > > org.apache.hadoop.hbase.TableNotFoundException: Cannot find row in .META. for table: tblAssetsView,
> > > > row=t\x00\x00\x00b\x00\x00\x00l\x00\x00\x00A\x00\x00\x00s\x00\x00\x00s\x00\x00\x00e\x00\x00\x00t\x00\x00\x00s\x00\x00\x00V\x00\x00\x00i\x00\x00\x00e\x00\x00\x00w\x00\x00\x00,,99999999999999
> > > >   at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:151)
> > > >   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:1059)
> > > >   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1121)
> > > >   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1001)
> > > >   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:958)
> > > >   at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:251)
> > > >   at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:155)
> > > >   at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getTable(ThriftServerRunner.java:458)
> > > >   at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getTable(ThriftServerRunner.java:464)
> > > >   at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getRowWithColumnsTs(ThriftServerRunner.java:766)
> > > >   at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getRow(ThriftServerRunner.java:739)
> > > >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > > >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > >   at java.lang.reflect.Method.invoke(Method.java:606)
> > > >   at org.apache.hadoop.hbase.thrift.HbaseHandlerMetricsProxy.invoke(HbaseHandlerMetricsProxy.java:65)
> > > >   at com.sun.proxy.$Proxy6.getRow(Unknown Source)
> > > >   at org.apache.hadoop.hbase.thrift.generated.Hbase$Processor$getRow.getResult(Hbase.java:3906)
> > > >   at org.apache.hadoop.hbase.thrift.generated.Hbase$Processor$getRow.getResult(Hbase.java:3894)
> > > >   at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:32)
> > > >   at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
> > > >   at org.apache.hadoop.hbase.thrift.TBoundedThreadPoolServer$ClientConnnection.run(TBoundedThreadPoolServer.java:287)
> > > >   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > > >   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > > >   at java.lang.Thread.run(Thread.java:744)
> > > > 14/03/11 12:18:53 WARN thrift.ThriftServerRunner$HBaseHandler: tblAssetsView
> > > > org.apache.hadoop.hbase.TableNotFoundException: tblAssetsView
> > > >   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1139)
> > > >   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1001)
> > > >   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:958)
> > > >   at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:251)
> > > >   at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:155)
> > > >   at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getTable(ThriftServerRunner.java:458)
> > > >   at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getTable(ThriftServerRunner.java:464)
> > > >   at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getRowWithColumnsTs(ThriftServerRunner.java:766)
> > > >   at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getRow(ThriftServerRunner.java:739)
> > > >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > > >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > >   at java.lang.reflect.Method.invoke(Method.java:606)
> > > >   at org.apache.hadoop.hbase.thrift.HbaseHandlerMetricsProxy.invoke(HbaseHandlerMetricsProxy.java:65)
> > > >   at com.sun.proxy.$Proxy6.getRow(Unknown Source)
> > > >   at org.apache.hadoop.hbase.thrift.generated.Hbase$Processor$getRow.getResult(Hbase.java:3906)
> > > >   at org.apache.hadoop.hbase.thrift.generated.Hbase$Processor$getRow.getResult(Hbase.java:3894)
> > > >   at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:32)
> > > >   at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
> > > >   at org.apache.hadoop.hbase.thrift.TBoundedThreadPoolServer$ClientConnnection.run(TBoundedThreadPoolServer.java:287)
> > > >   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > > >   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > > >   at java.lang.Thread.run(Thread.java:744)
> > > >
> > > > But I can access my table from the Hbase shell with all shell
> > > > operations. I am totally stuck here; please suggest a way to overcome
> > > > this issue.
> > > >
> > > > Umesh Chaudhary
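One observable detail in the trace quoted above: in the bad row key, every letter of tblAssetsView is followed by three \x00 bytes. That is the layout a four-byte-per-character (UTF-32-style) string encoding produces, whereas HBase expects the raw UTF-8 bytes of the name; this would also explain why the table is visible from the shell yet "not found" through the C# client. A small JDK-only sketch that reproduces both byte layouts:

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class RowKeyEncoding {
    public static void main(String[] args) {
        String table = "tblAssetsView";

        // What HBase expects: one byte per ASCII character.
        byte[] utf8 = table.getBytes(StandardCharsets.UTF_8);

        // What the broken row key looks like: every character padded with
        // three NUL bytes, i.e. t\x00\x00\x00 b\x00\x00\x00 ...
        byte[] wide = table.getBytes(Charset.forName("UTF-32LE"));

        System.out.println(utf8.length);  // 13
        System.out.println(wide.length);  // 52
        System.out.printf("%02x %02x %02x %02x%n",
                wide[0], wide[1], wide[2], wide[3]);  // 74 00 00 00 ('t')
    }
}
```

On the C# side, the equivalent of the UTF-8 line would be Encoding.UTF8.GetBytes(tableName) rather than a BitConverter-based conversion.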
