Regardless, there is no point in using Sqoop for this purpose. It is not
really designed for it :)
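
A minimal sketch of the Hive EXPORT/IMPORT route discussed below (table,
partition and paths are made up for illustration):

    -- on the source cluster
    EXPORT TABLE sales PARTITION (ds='2017-02-20') TO '/tmp/export/sales';

    # copy the exported data and metadata over to the target cluster
    hadoop distcp hdfs://source-nn:8020/tmp/export/sales \
                  hdfs://target-nn:8020/tmp/export/sales

    -- on the target cluster
    IMPORT TABLE sales FROM '/tmp/export/sales';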

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 21 February 2017 at 10:50, Jörn Franke <jornfra...@gmail.com> wrote:

> From Hive to Hive on another cluster I would use Hive Import/Export, if
> possible with Falcon, instead of Sqoop. Sqoop always needs to do
> serialization/deserialization, which is fine if the systems are different
> (e.g. Oracle -> Hive), but if it is the same system then it usually makes
> sense to use the tool of the system (e.g. Hive import/export) for
> performance/resource usage reasons.
>
>
> On 21 Feb 2017, at 11:31, Mich Talebzadeh <mich.talebza...@gmail.com>
> wrote:
>
> Thanks Jörn.
>
> The idea is to test a big data high availability tool by ingesting data
> into the target cluster. I have provisioned an Oracle schema for it, so we
> can use Sqoop to get data into Hive.
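>
> For the Oracle leg, roughly something like this (hostname, SID, credentials
> and table names are placeholders, not the actual schema):
>
>     sqoop import \
>       --connect jdbc:oracle:thin:@oracle-host:1521:ORCL \
>       --username scott \
>       --password-file /user/hive/.ora_pwd \
>       --table SALES \
>       --hive-import --hive-table default.sales \
>       -m 4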
>
> Someone suggested trying to use Sqoop to ingest from a Hive table in one
> cluster to the target cluster.
>
> That is not really a test, is it?
>
> Dr Mich Talebzadeh
>
>
>
>
> On 21 February 2017 at 10:26, Jörn Franke <jornfra...@gmail.com> wrote:
>
>> Hello,
>>
>> I have not tried it, but Sqoop supports any JDBC driver. However, since
>> the SQL syntax is not necessarily standardized, you may face issues or
>> performance problems. Hive itself has a nice import and export tool that
>> also supports metadata import/export. It can be orchestrated from Oozie
>> and/or Falcon.
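>>
>> Untested, but the Sqoop-over-JDBC route being asked about would look
>> roughly like this (host and table names are made up for illustration, and
>> the caveats above about syntax and performance apply):
>>
>>     sqoop import \
>>       --connect jdbc:hive2://source-hiveserver:10000/default \
>>       --driver org.apache.hive.jdbc.HiveDriver \
>>       --table sales \
>>       --target-dir /tmp/sales_copy \
>>       -m 1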
>>
>> Best regards
>>
>> On 21 Feb 2017, at 11:16, Mich Talebzadeh <mich.talebza...@gmail.com>
>> wrote:
>>
>> Hi,
>>
>> I have not tried this, but someone mentioned that it is possible to use
>> Sqoop to get data from an Impala/Hive table in one cluster into another.
>>
>> The clusters are in different zones. This is to test the cluster. Has
>> anyone done such a thing?
>>
>> Thanks
>>
>>
>>
>> Dr Mich Talebzadeh
>>
>>
>>
>>
>>
>
