symat opened a new pull request #2777:
URL: https://github.com/apache/hbase/pull/2777


   Currently IntegrationTestImportTsv is generating HFiles under the working 
directory of the hdfs user executing the tool, before bulkloading it into HBase.
   
   Assuming you encrypt the HBase root directory within HDFS (using HDFS 
Transparent Encryption), you can bulkload HFiles only if they sit in the same 
encryption zone in HDFS as the HBase root directory itself.
   
   When IntegrationTestImportTsv is executed against a real distributed cluster 
and the working directory of the current user (e.g. /user/hbase) is not in the 
same encryption zone as the HBase root directory (e.g. /hbase/data), you 
will get an exception:
   
   ```
   ERROR org.apache.hadoop.hbase.regionserver.HRegion: There was a partial 
failure due to IO when attempting to load d :
   
hdfs://mycluster/user/hbase/test-data/22d8460d-04cc-e032-88ca-2cc20a7dd01c/IntegrationTestImportTsv/hfiles/d/74655e3f8da142cb94bc31b64f0475cc
   
   org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
/user/hbase/test-data/22d8460d-04cc-e032-88ca-2cc20a7dd01c/IntegrationTestImportTsv/hfiles/d/74655e3f8da142cb94bc31b64f0475cc
 can't be moved into an encryption zone.
   ```
   
   In this commit I make the directory where IntegrationTestImportTsv 
generates the HFiles configurable. From now on, one can execute this integration 
test on clusters with HDFS Transparent Encryption enabled, like:
   
   ```
   ./bin/hbase org.apache.hadoop.hbase.mapreduce.IntegrationTestImportTsv -D 
IntegrationTestImportTsv.generatedHFileFolder=/<my hbase encryption zone 
path>/testdata
   ```
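
   The resolution logic can be sketched as follows. This is a hedged, 
self-contained illustration (the class and method names here are hypothetical, 
not taken from the patch): look up the `IntegrationTestImportTsv.generatedHFileFolder` 
property, and fall back to a folder under the current user's working directory 
when it is unset.

   ```java
   import java.util.Properties;

   public class HFileFolderResolver {
       // Property name as used in the command-line example above.
       static final String GENERATED_HFILE_FOLDER_KEY =
           "IntegrationTestImportTsv.generatedHFileFolder";

       // Returns the configured HFile folder, or a folder under the
       // user's working directory when the property is not set.
       static String resolveHFileFolder(Properties conf, String workingDir) {
           String configured = conf.getProperty(GENERATED_HFILE_FOLDER_KEY);
           return (configured != null && !configured.isEmpty())
               ? configured
               : workingDir + "/hfiles";
       }

       public static void main(String[] args) {
           Properties conf = new Properties();
           // Without the property, HFiles land under the working directory,
           // which may be outside the HBase encryption zone.
           System.out.println(resolveHFileFolder(conf, "/user/hbase/test-data"));
           // With the property, HFiles land inside the encryption zone.
           conf.setProperty(GENERATED_HFILE_FOLDER_KEY, "/hbase/testdata");
           System.out.println(resolveHFileFolder(conf, "/user/hbase/test-data"));
       }
   }
   ```

   With this fallback, existing users who do not set the property keep the 
old behavior, while encrypted-HDFS deployments can point the output at a path 
inside the encryption zone.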


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

