Yes, thank you for reminding me.
2013/9/11 Jean-Marc Spaggiari <[email protected]>

> Hi Kun,
>
> That will give you the HFile information only, and one region can have
> multiple HFiles. You can also call this tool with a RegionName instead of
> an HFile path, but that will still give you only the size of that region.
> If you want the table size, you will need to loop over all of the regions.
>
> JM
>
> 2013/9/11 kun yan <[email protected]>
>
>> Thank you, Jean-Marc; your answers are always helpful. I also learned
>> that the HFiles of a region can be inspected with the following tool:
>>
>> $ ./bin/hbase org.apache.hadoop.hbase.io.hfile.HFile
>>
>> 2013/9/11 Jean-Marc Spaggiari <[email protected]>
>>
>>> Hi Kun,
>>>
>>> If you already have your table, then you can "simply" use the hadoop
>>> command-line tool to get this information. As an example:
>>>
>>> $ bin/hadoop fs -du /hbase/
>>> Found 21 items
>>> 4811          hdfs://node3:9000/hbase/-ROOT-
>>> 1807799       hdfs://node3:9000/hbase/.META.
>>> 0             hdfs://node3:9000/hbase/.archive
>>> 0             hdfs://node3:9000/hbase/.corrupt
>>> 0             hdfs://node3:9000/hbase/.hbck
>>> 0             hdfs://node3:9000/hbase/.logs
>>> 0             hdfs://node3:9000/hbase/.oldlogs
>>> 0             hdfs://node3:9000/hbase/.tmp
>>> 45493173      hdfs://node3:9000/hbase/dns
>>> 38            hdfs://node3:9000/hbase/hbase.id
>>> 3             hdfs://node3:9000/hbase/hbase.version
>>> 254783827985  hdfs://node3:9000/hbase/page
>>> 134416542     hdfs://node3:9000/hbase/page_proposed
>>> 223546827963  hdfs://node3:9000/hbase/work_proposed
>>> 276434748     hdfs://node3:9000/hbase/work_sent
>>>
>>> As you can see, "page" is my biggest table at 237 GB.
>>>
>>> JM
>>>
>>> 2013/9/11 kun yan <[email protected]>
>>>
>>>> Hi all. How can I find out how much HDFS storage space an HBase table
>>>> is using? Which command, or which page in the HBase web UI, will show
>>>> me this? (HBase version 0.94)

--
In the Hadoop world, I am just a novice exploring the entire Hadoop
ecosystem; I hope one day I can contribute my own code.

YanBit
[email protected]
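JM's last point, looping over the regions and adding up their sizes, can be sketched as a small shell pipeline over `hadoop fs -du` output: each region is a subdirectory of the table in the 0.94-era layout (`/hbase/<table>/<region>`), so summing the first column of the listing gives the table total. The `sum_gib` helper name is illustrative, and the sketch is fed a canned line from JM's listing so it runs without a cluster; on a live cluster you would pipe the real command through it instead.

```shell
# Hypothetical helper: add up the byte counts in column 1 of
# `hadoop fs -du`-style output and print a rounded GiB total.
sum_gib() {
  awk '{ sum += $1 } END { printf "%.0f\n", sum / (1024 * 1024 * 1024) }'
}

# On a live cluster (illustrative path from the thread):
#   bin/hadoop fs -du /hbase/page | sum_gib

# Here, fed the "page" line from JM's listing so the sketch runs anywhere:
sum_gib <<'EOF'
254783827985  hdfs://node3:9000/hbase/page
EOF
# prints 237
```

The same pipeline over the whole `/hbase/` listing gives the overall footprint; 254783827985 bytes rounds to JM's 237 GB figure (GiB, strictly speaking).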
