Hey there,

I'm assuming you'd like to use Sqoop to transfer the data to a local file so
that you can transport it out of your closed environment? If so, I'd check out
"local" fs support:
https://sqoop.apache.org/docs/1.4.5/SqoopUserGuide.html#_using_generic_and_specific_arguments.
Essentially, Sqoop accepts the generic Hadoop arguments (-fs, -jt, etc.), so
you can point the import at the local file system and have it write there.
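
For example, something along these lines should pull the table straight to
local disk as Avro (the connect string, credentials, table, and paths below
are just placeholders, so adjust them for your setup):

  sqoop import -fs local -jt local \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username scott -P \
      --table EMPLOYEES \
      --as-avrodatafile \
      --target-dir /tmp/employees_avro

Note the -fs local -jt local generic arguments have to come before the
Sqoop-specific arguments; they tell the underlying Hadoop client to use the
local file system and run the job locally.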

AFAIK, Sqoop1 doesn't support local FS => HDFS data transfers. In Sqoop2, such
a general data transfer use case is being worked on. You can use HDFS "put"
in the meantime, I'd imagine.
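
E.g., once the Avro files have been transported to an edge node of the Hadoop
cluster, a plain put is enough to stage them in HDFS (paths are placeholders):

  hadoop fs -put /tmp/employees_avro /user/narasimha/employees_avro

From there the data is in HDFS and available to the rest of the cluster.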

-Abe

On Wed, Jan 21, 2015 at 4:20 PM, Narasimha Tadepalli <
[email protected]> wrote:

>  We have a very complex distributed environment where our Hadoop and Oracle
> database are in two different private network infrastructures. We want to use
> Sqoop1 to import the Oracle database in Avro, then transport that data close
> to the HDFS environment, and then export it using Sqoop1 into the Hadoop
> systems. Is there a way I can run Sqoop1 without the Hadoop prerequisite on
> the system? Maybe by just adding a few dependency jars to the path?
>
>
>
> Thanks
>
> Narasimha
>
