Hello!

loadCache() will pull all rows from the result set on every node. How many
nodes do you have?

10 million rows is actually a very modest number. I could understand you
worrying if you had 10B rows.

The best approach is to partition your table and push entries to the data
streamer from multiple threads, e.g. "SELECT * FROM table WHERE MOD(id, N)
= ?", where N is the number of threads and the parameter is the thread
number.
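
A minimal sketch of that approach (the cache name, table, and column names here are assumptions — adjust them to your schema): each thread opens its own JDBC connection, selects one MOD bucket, and feeds rows into a shared IgniteDataStreamer, whose addData() is thread-safe.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;

public class PartitionedLoader {
    /** Build the per-thread query: each thread loads only its MOD bucket.
     *  Table and column names are placeholders. */
    static String partitionQuery(int threads) {
        return "SELECT id, name FROM PERSON WHERE MOD(id, " + threads + ") = ?";
    }

    public static void load(Ignite ignite, String jdbcUrl, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);

        // One streamer shared by all threads; addData() is thread-safe.
        try (IgniteDataStreamer<Long, String> stmr = ignite.dataStreamer("PersonCache")) {
            for (int i = 0; i < threads; i++) {
                final int bucket = i;
                pool.submit(() -> {
                    try (Connection conn = DriverManager.getConnection(jdbcUrl);
                         PreparedStatement ps = conn.prepareStatement(partitionQuery(threads))) {
                        ps.setInt(1, bucket);
                        try (ResultSet rs = ps.executeQuery()) {
                            while (rs.next())
                                stmr.addData(rs.getLong("id"), rs.getString("name"));
                        }
                    } catch (SQLException e) {
                        throw new RuntimeException(e);
                    }
                });
            }
            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.HOURS);
        } // closing the streamer flushes any remaining buffered entries
    }
}
```

With 10M rows you may also want to raise perNodeParallelOperations() on the streamer and size the thread count to your Oracle connection limit.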

Regards,
-- 
Ilya Kasnacheev


On Tue, Feb 18, 2020 at 16:34, nithin91 <
[email protected]>:

> Hi,
>
> I have multiple Oracle tables with more than 10 million rows each. I want
> to load these tables into Ignite cache. To load the cache I am using Cache
> JDBC POJO Store, with the required project structure generated from Web
> Console.
>
> But loading the data using Cache JDBC POJO Store (i.e.
> ignite.cache("CacheName").loadCache(null)) is taking a lot of time. Is
> there any alternative approach to load the data from Oracle DB to Ignite
> cache?
>
> I tried using the data streamer as well, but I am not clear on how to use
> it. It would be helpful if someone could share sample code to load data
> into an Ignite cache using the data streamer.
>
> I have one doubt regarding the usage of the data streamer.
>
> The following is the process mentioned in the documentation to implement
> the data streamer:
>
>  // Stream words into the streamer cache.
>   for (String word : text)
>     stmr.addData(word, 1L);
> }
>
> But in my case I would loop through the ResultSet generated by executing a
> prepared statement over a JDBC connection and add each row to the data
> streamer. Will this be efficient, given that I have to loop through
> 10 million rows? Please correct me if this is not the right way of using
> the data streamer.
>
> --
> Sent from: http://apache-ignite-users.70518.x6.nabble.com/
>
