byungwok opened a new issue #742: How to move Accumulo table data to an HDFS 
cluster with a different structure 
URL: https://github.com/apache/accumulo/issues/742
 
 
   Hi,
   First of all, thank you for your previous answer. I checked it and it is 
working well. Many thanks.
   
   This question is about exporting and importing Accumulo table data from one 
HDFS cluster to another HDFS cluster that has a different IP, port, and 
directory structure. 
   I have already read the export/import documentation 
(https://accumulo.apache.org/1.7/examples/export), but it assumes the source 
and destination share the same machines and the same Hadoop HDFS. Of course, I 
know how to copy from HDFS to HDFS using 'distcp' and an HDFS URL. 
Unfortunately, for security reasons the destination HDFS is isolated and has a 
different IP and directory structure, so I have to carry the data like this: 
HDFS -> local -> HDFS. I know it looks silly and is very inefficient, but it 
is the best approach I know of so far.
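   To make my situation concrete, here is roughly the sequence I have in mind; 
the table name, host names, and paths below are only placeholders for my 
environment, and I am not sure every step is right:
   
      # On the source cluster, in the Accumulo shell:
      offline -t mytable
      exporttable -t mytable /tmp/mytable_export
      
      # Pull the export directory and the files it lists down to local disk
      # (distcp.txt should list the table's RFiles plus exportMetadata.zip):
      hdfs dfs -get /tmp/mytable_export /data/transfer/mytable_export
      mkdir -p /data/transfer/mytable_files
      hdfs dfs -get $(cat /data/transfer/mytable_export/distcp.txt) /data/transfer/mytable_files
      
      # ... physically carry /data/transfer over to the isolated cluster ...
      
      # On the destination cluster, push everything into one import directory:
      hdfs dfs -mkdir -p /tmp/mytable_import
      hdfs dfs -put /data/transfer/mytable_files/* /tmp/mytable_import
      
      # In the destination Accumulo shell:
      importtable mytable_copy /tmp/mytable_import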
   
   So I think I may need to rewrite 'distcp.txt' and the other files that 
reference the Accumulo table data so that they match the URL structure of the 
destination HDFS, editing them locally before uploading. I am not sure this is 
correct, because my attempt has already failed. In this case, should I change 
the IP and URL in all of the files: distcp.txt, *_info.txt, *.bin, and 
table_config.txt? Or do you know of an easier way to move the data?
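   For example, this is the kind of rewrite I attempted on distcp.txt locally 
(the old and new namenode addresses are just placeholders); I do not know 
whether the other files need the same treatment:
   
      # replace the source namenode address with the destination one
      sed -i 's#hdfs://old-nn.example.com:8020#hdfs://new-nn.example.com:9000#g' distcp.txt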
   Please let me know how best to do this.
   
   Thank you for reading, and have a good day :)

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
