Hi Kiru,

Can you give more information about your data? I cannot tell what the
exact problem is. I have some questions: What is your table schema?
What is the data size? What is your heap size? What is the average row
size? In general, you can read these sources on HBase tuning:

[1] 
http://www.ericsson.com/research-blog/data-knowledge/hbase-performance-tuners/
[2] http://phoenix.apache.org/tuning.html
[3] http://hbase.apache.org/book.html#performance
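
Also, if the goal is mainly to create a smaller copy of the big table,
one option is to do the copy server-side with Phoenix's UPSERT SELECT
instead of pulling all the rows through the client with a plain SELECT.
This is just a sketch (table and column names are placeholders, and you
would create the target table with the same schema first):

    -- create the target table with the same schema as the source, then:
    UPSERT INTO table1_small SELECT * FROM table1 LIMIT 100000;

That keeps the data movement inside the cluster rather than streaming
100000 rows back through sqlline.py or SQuirreL.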

HTH
Talat

2015-05-12 3:47 GMT+03:00 Kiru Pakkirisamy <[email protected]>:
> We are trying to benchmark/test Phoenix with large tables. A 'select * from
> table1 limit 100000' hangs on a 1.4 billion row table (in sqlline.py or
> SQuirreL). The same select of 1 million rows works on a smaller table (300
> million). Mainly we wanted to create a smaller version of the 1.4 billion row
> table and ran into this issue. Any ideas why this is happening? We had quite a
> few problems crossing the 1 billion mark even when loading the table (using
> CsvBulkLoadTool). We are also wondering whether our HBase is
> configured correctly.
> Any tips on HBase configuration for loading/running Phoenix are highly
> appreciated as well. (We are on HBase 0.98.12 and Phoenix 4.3.1) Regards,
> - kiru



-- 
Talat UYARER
Websitesi: http://talat.uyarer.com
Twitter: http://twitter.com/talatuyarer
Linkedin: http://tr.linkedin.com/pub/talat-uyarer/10/142/304
