I assume you are looking at
https://accumulo.apache.org/1.10/examples/export.html

I can think of two options:

1 - use Hadoop's distcp command to copy the exported table over to the other
instance (rough example below)
2 - if you really, really want a local copy, then use something like `hdfs
dfs -get <hdfs-path> <local-path>`
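
A rough sketch of that export/distcp flow, based on the examples page above;
the table name, paths, and namenode addresses are placeholders, so adjust
them for your setup:

  # Accumulo shell on the source instance (the table has to be offline
  # before exporttable will run)
  offline tname
  exporttable -t tname /tmp/tname-export

  # command line: distcp.txt, written by exporttable, lists the export
  # metadata and the table's RFiles to copy
  hadoop distcp -f hdfs://source-nn:8020/tmp/tname-export/distcp.txt \
      hdfs://dest-nn:8020/tmp/tname-export

  # Accumulo shell on the destination instance
  importtable tname_copy /tmp/tname-export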

Mike

On Thu, Sep 16, 2021 at 3:58 PM Ligade, Shailesh [USA] <
ligade_shail...@bah.com> wrote:

> Hello,
>
> I need to export data to another instance of HDFS for replication
> baseline purposes.
>
> Can exporttable export data directly to the local file system? I was
> planning to scp it to the other instance.
>
> From the shell, using Accumulo 1.10, I tried
>
> exporttable -t tname file:///export
>
> but it didn't work. I tried file:// as well as export/; neither worked.
>
> I received errors like "Failed to create export files Mkdirs failed to
> create file:/export (exist=false, cwd=file:/)".
>
> For file:// it gives an AccumuloException:
> Internal error processing waitForFateOperation
>
> So is it possible?
>
> -S
