Lars checked in HBASE-6580 today, which deprecates HTablePool.

Please take a look.
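For readers following along, here is a minimal sketch of the replacement pattern that HBASE-6580 points to: obtaining lightweight table handles from a shared HConnection instead of an HTablePool. This assumes a client version that includes HBASE-6580; the table name "mytable" and the row/column values are only examples, and running it requires hbase-client on the classpath plus a reachable cluster.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HConnection;
import org.apache.hadoop.hbase.client.HConnectionManager;
import org.apache.hadoop.hbase.client.HTableInterface;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class PooledTableExample {
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        // One heavyweight connection per application, shared by all tables.
        HConnection connection = HConnectionManager.createConnection(conf);
        try {
            // Per HBASE-6580, getTable() is cheap; no pool is needed.
            HTableInterface table = connection.getTable("mytable");
            try {
                Put put = new Put(Bytes.toBytes("row1"));
                put.add(Bytes.toBytes("info"), Bytes.toBytes("col"),
                        Bytes.toBytes("value"));
                table.put(put);
            } finally {
                table.close(); // releases the table, not the shared connection
            }
        } finally {
            connection.close(); // close once, at application shutdown
        }
    }
}
```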

On Wed, Aug 7, 2013 at 6:08 PM, ch huang <[email protected]> wrote:

>  table.close() does not actually close the table here; it just returns
> the table's connection to the pool, because the putTable method in
> HTablePool is deprecated. See:
>
>
> http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/client/HTablePool.html#putTable(org.apache.hadoop.hbase.client.HTableInterface)
>
>
>
>
> On Wed, Aug 7, 2013 at 9:46 PM, Ted Yu <[email protected]> wrote:
>
> > The table is closed after each Put in writeRow(). This is not efficient.
> >
> > Take a look at http://hbase.apache.org/book.html#client , section 9.3.1,
> > Connections.
> >
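Ted's advice amounts to the following pattern: create the table handle once and reuse it across many Puts, closing it only at the end. A minimal sketch, assuming HBase 0.94-era API; the table name "desttable", the loop bound, and the column values are illustrative only, and a running cluster is required.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class ReuseTableExample {
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        // Create the table handle once, outside the per-row loop.
        HTable table = new HTable(conf, "desttable"); // example table name
        table.setAutoFlush(false); // buffer Puts in the client-side write buffer
        try {
            for (int i = 0; i < 1000; i++) {
                Put put = new Put(Bytes.toBytes("row-" + i));
                put.add(Bytes.toBytes("info"), Bytes.toBytes("col"),
                        Bytes.toBytes("value-" + i));
                table.put(put); // queued, not sent immediately
            }
            table.flushCommits(); // ship any remaining buffered Puts
        } finally {
            table.close(); // close once, after all rows are written
        }
    }
}
```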
> > On Wed, Aug 7, 2013 at 5:11 AM, Lu, Wei <[email protected]> wrote:
> >
> > > Decrease the caching size (say, to 1000) and increase the batch size,
> > > or just leave batch at the default if the number of qualifiers per row
> > > is not too large.
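The suggested tuning can be sketched as below; the numbers are illustrative, not prescriptive.

```java
import org.apache.hadoop.hbase.client.Scan;

public class ScanTuningExample {
    static Scan makeScan() {
        Scan scan = new Scan();
        scan.setCaching(1000); // rows fetched per RPC; 10000 wide rows can take
                               // longer than the scanner timeout between next() calls
        scan.setBatch(10);     // max columns per Result; mainly useful for wide rows
        return scan;
    }
}
```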
> > >
> > >
> > > -----Original Message-----
> > > From: ch huang [mailto:[email protected]]
> > > Sent: Wednesday, August 07, 2013 5:18 PM
> > > To: [email protected]
> > > Subject: issue about search speed and rowkey
> > >
> > > hi, all:
> > >
> > >         I have a problem running the following code. The MoveData
> > > method reads data from a source table, modifies each row's rowkey, and
> > > inserts the rows into a destination table, but I always get the error
> > > below. Can anyone help?
> > >
> > >     public static void writeRow(HTablePool htp, String tablename,
> > >             String rowkey, String cf, String col, String value) {
> > >         HTableInterface table = null;
> > >         try {
> > >             table = htp.getTable(Bytes.toBytes(tablename));
> > >             Put put = new Put(Bytes.toBytes(rowkey));
> > >             // the rowkey doubles as the cell timestamp here
> > >             put.add(Bytes.toBytes(cf),
> > >                     Bytes.toBytes(col),
> > >                     Long.parseLong(rowkey),
> > >                     Bytes.toBytes(value));
> > >             table.put(put);
> > >         } catch (IOException e) {
> > >             e.printStackTrace();
> > >         } finally {
> > >             // for a pooled table, close() returns it to the pool
> > >             if (table != null) {
> > >                 try { table.close(); } catch (IOException ignored) {}
> > >             }
> > >         }
> > >     }
> > >
> > >     public static void MoveData(String src_t, String dest_t) {
> > >         try {
> > >             HTable tabsrc = new HTable(conf, src_t);
> > >             // note: tabdest is opened but never written to; all Puts
> > >             // go through the pool in writeRow()
> > >             HTable tabdest = new HTable(conf, dest_t);
> > >             tabsrc.setAutoFlush(false);
> > >             tabdest.setAutoFlush(false);
> > >             HTablePool tablePool = new HTablePool(conf, 5);
> > >             Scan scan = new Scan();
> > >             scan.setCaching(10000);
> > >             scan.setBatch(10);
> > >             ResultScanner rs = tabsrc.getScanner(scan);
> > >
> > >             for (Result r : rs) {
> > >                 List<String> al = new ArrayList<String>();
> > >                 Map<String, String> hm = new HashMap<String, String>();
> > >                 for (KeyValue kv : r.raw()) {
> > >                     hm.put(new String(kv.getQualifier()),
> > >                            new String(kv.getValue()));
> > >                     al.add(new String(kv.getQualifier()));
> > >                 }
> > >
> > >                 // the value of the "date" qualifier becomes the new rowkey
> > >                 for (int i = 0; i < al.size(); i++) {
> > >                     writeRow(tablePool, dest_t, hm.get("date"), "info",
> > >                             al.get(i), hm.get(al.get(i)));
> > >                 }
> > >             }
> > >
> > >             rs.close();
> > >             tabsrc.close();
> > >             tabdest.close();
> > >         } catch (IOException e) {
> > >             e.printStackTrace();
> > >         }
> > >     }
> > >
> > >
> > >
> > > 2013-08-07 16:43:31,250 WARN  [main] conf.Configuration (Configuration.java:warnOnceIfDeprecated(824)) - hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > > java.lang.RuntimeException: org.apache.hadoop.hbase.client.ScannerTimeoutException: 123891ms passed since the last invocation, timeout is currently set to 120000
> > >  at org.apache.hadoop.hbase.client.AbstractClientScanner$1.hasNext(AbstractClientScanner.java:44)
> > >  at com.testme.demo.HBaseTest.MoveData(HBaseTest.java:186)
> > >  at com.testme.demo.HBaseTest.main(HBaseTest.java:314)
> > > Caused by: org.apache.hadoop.hbase.client.ScannerTimeoutException: 123891ms passed since the last invocation, timeout is currently set to 120000
> > >  at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:283)
> > >  at org.apache.hadoop.hbase.client.AbstractClientScanner$1.hasNext(AbstractClientScanner.java:41)
> > >  ... 2 more
> > > Caused by: org.apache.hadoop.hbase.UnknownScannerException: org.apache.hadoop.hbase.UnknownScannerException: Name: -1792412350530007203
> > >  at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2464)
> > >  at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
> > >  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >  at java.lang.reflect.Method.invoke(Method.java:597)
> > >  at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:320)
> > >  at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1426)
> > >
> > >  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > >  at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
> > >  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
> > >  at java.lang.reflect.Constructor.newInstance(Unknown Source)
> > >  at org.apache.hadoop.hbase.RemoteExceptionHandler.decodeRemoteException(RemoteExceptionHandler.java:96)
> > >  at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:143)
> > >  at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:42)
> > >  at org.apache.hadoop.hbase.client.ServerCallable.withRetries(ServerCallable.java:163)
> > >  at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:274)
> > >  ... 3 more
> > >
> >
>
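As a last resort, the ScannerTimeoutException above can also be mitigated by raising the scanner lease in hbase-site.xml. A sketch, with the caveat that the property name varies by version (0.94 uses the server-side hbase.regionserver.lease.period; later clients use hbase.client.scanner.timeout.period), and the value shown is only an example larger than the 120000 ms reported in the error:

```xml
<property>
  <name>hbase.regionserver.lease.period</name>
  <!-- scanner lease in ms; must exceed the client's time between next() calls -->
  <value>300000</value>
</property>
```

Reducing scan.setCaching() so each next() call returns faster is usually the better first fix.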
