[
https://issues.apache.org/jira/browse/HBASE-48?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
stack updated HBASE-48:
-----------------------
Attachment: loadtable.rb
Start of a script that moves hfiles into place under hbase.rootdir and then
updates the catalog table. Not finished yet.
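
For reference, a rough Java sketch of the two steps the script is meant to perform. It only uses the Hadoop FileSystem API; the table name, region name, and column family in the paths are illustrative placeholders, not what loadtable.rb actually writes, and the catalog update is left as a comment because the client calls differ across HBase versions.

{code:java}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class BulkLoadSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);

    // 1. Move a pre-built hfile under hbase.rootdir so a region can serve it
    //    in place, e.g. <rootdir>/<table>/<encoded-region>/<family>/<hfile>.
    //    "mytable", the region name, and "cf" below are placeholders.
    Path rootDir = new Path(conf.get("hbase.rootdir", "/hbase"));
    Path src = new Path(args[0]);            // hfile written offline
    Path dst = new Path(rootDir, "mytable/1234567890/cf/" + src.getName());
    fs.mkdirs(dst.getParent());
    if (!fs.rename(src, dst)) {
      throw new IllegalStateException("rename failed: " + src + " -> " + dst);
    }

    // 2. Update the catalog table so the new region becomes visible: insert
    //    a row for the region into .META. carrying its serialized HRegionInfo
    //    under info:regioninfo. The exact client calls depend on the HBase
    //    version, so that step is only outlined here.
  }
}
{code}

Moving the file with rename rather than copy keeps the load cheap: a rename within HDFS only touches namenode metadata, no bytes are rewritten.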
> [hbase] Bulk load and dump tools
> --------------------------------
>
> Key: HBASE-48
> URL: https://issues.apache.org/jira/browse/HBASE-48
> Project: Hadoop HBase
> Issue Type: New Feature
> Reporter: stack
> Priority: Minor
> Attachments: 48-v2.patch, 48.patch, loadtable.rb
>
>
> HBase needs tools to facilitate bulk upload and possibly bulk dump. Going via
> the current APIs, uploads can take a long time even with many concurrent
> clients, particularly if the dataset is large and the cell content is small.
> The PNUTS folks talked of needing a different API to manage bulk upload/dump.
> Another notion would be to have the bulk load tools write regions directly
> into HDFS.
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.