Hi Labard,

Could you please explain your use case? Depending on your requirements, we
could either suggest some existing Ignite functionality or perhaps point you
to our Hadoop Accelerator module:
https://apacheignite-fs.readme.io/docs/hadoop-accelerator

Vladimir.

On Tue, Jul 26, 2016 at 6:26 PM, Labard <[email protected]> wrote:

> Hello.
> I'm trying to load data from Hadoop (20 GB) into Ignite. Can anybody tell me
> how I should do this?
>
> I have 5 servers in the cluster, with 8 Ignite nodes and one Hadoop node on
> each server.
>
> Right now I have one client node that fetches the data over JDBC (via Hive)
> from Hadoop and puts it into Ignite with IgniteDataStreamer.addData().
> But this is too slow, I assume because all the data goes through the Hadoop
> NameNode and is then loaded by a single client node.
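>
> Roughly what I do now, simplified (the cache name, table, and connection
> details below are just placeholders, not my real configuration):
>
> import java.sql.*;
> import org.apache.ignite.Ignite;
> import org.apache.ignite.IgniteDataStreamer;
> import org.apache.ignite.Ignition;
>
> public class HiveToIgniteLoader {
>     public static void main(String[] args) throws Exception {
>         // Single client node pulls rows from Hive over JDBC and streams them in.
>         try (Ignite ignite = Ignition.start("client-config.xml");
>              Connection conn = DriverManager.getConnection(
>                  "jdbc:hive2://hadoop-host:10000/default");
>              Statement stmt = conn.createStatement();
>              ResultSet rs = stmt.executeQuery("SELECT id, payload FROM source_table");
>              // The streamer batches entries and routes them to the owning nodes.
>              IgniteDataStreamer<Long, String> streamer = ignite.dataStreamer("myCache")) {
>             while (rs.next())
>                 streamer.addData(rs.getLong(1), rs.getString(2));
>         }
>     }
> }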
>
> If I use loadCache() instead of the IgniteDataStreamer, I'm afraid the Hive
> database could fall over under 40 queries against 20 GB of files.
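>
> As far as I understand, loadCache() would invoke a CacheStore on every
> server node, so with 40 nodes Hive would see 40 queries at once. A rough
> sketch of what such a store might look like (the class name, query, and
> connection string are made up, just to illustrate my concern):
>
> import java.sql.*;
> import javax.cache.integration.CacheLoaderException;
> import org.apache.ignite.cache.store.CacheStoreAdapter;
> import org.apache.ignite.lang.IgniteBiInClosure;
>
> public class HiveCacheStore extends CacheStoreAdapter<Long, String> {
>     // Called on every node that owns partitions of the cache when
>     // IgniteCache.loadCache(null) is invoked, so each node queries Hive.
>     @Override public void loadCache(IgniteBiInClosure<Long, String> clo, Object... args) {
>         try (Connection conn = DriverManager.getConnection(
>                  "jdbc:hive2://hadoop-host:10000/default");
>              Statement stmt = conn.createStatement();
>              ResultSet rs = stmt.executeQuery("SELECT id, payload FROM source_table")) {
>             while (rs.next())
>                 clo.apply(rs.getLong(1), rs.getString(2));
>         }
>         catch (SQLException e) {
>             throw new CacheLoaderException(e);
>         }
>     }
>
>     // Read-through / write-through are not the point here, so these are no-ops.
>     @Override public String load(Long key) { return null; }
>     @Override public void write(javax.cache.Cache.Entry<? extends Long, ? extends String> e) { /* no-op */ }
>     @Override public void delete(Object key) { /* no-op */ }
> }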
>
> Ideally I want to read the data straight from the Hadoop DataNodes and
> process a small piece of it on each Ignite node.
> Maybe I can use IGFS integrated with Hadoop (IgniteHadoopIgfsSecondaryFileSystem)
> to achieve this?
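>
> Something like this on each Ignite node is what I have in mind (just a
> sketch; I'm assuming the IgniteHadoopIgfsSecondaryFileSystem class and a
> placeholder NameNode URI, I haven't tried it yet):
>
> import org.apache.ignite.Ignition;
> import org.apache.ignite.configuration.FileSystemConfiguration;
> import org.apache.ignite.configuration.IgniteConfiguration;
> import org.apache.ignite.hadoop.fs.IgniteHadoopIgfsSecondaryFileSystem;
>
> public class IgfsNodeStartup {
>     public static void main(String[] args) {
>         // IGFS on top of HDFS: IGFS caches and serves the data,
>         // while HDFS remains the backing store.
>         FileSystemConfiguration fsCfg = new FileSystemConfiguration();
>         fsCfg.setName("igfs");
>         fsCfg.setSecondaryFileSystem(
>             new IgniteHadoopIgfsSecondaryFileSystem("hdfs://namenode-host:9000/"));
>
>         IgniteConfiguration cfg = new IgniteConfiguration();
>         cfg.setFileSystemConfiguration(fsCfg);
>
>         Ignition.start(cfg);
>     }
> }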
>
> Sincerely, Dmitry
>
>
>
>
