Hello!

Loading 20,000 records should never take 25 minutes: that is only about 13
records per second. I can't think of any reason it would take such a
monumental amount of time.

Regarding the data streamer: as I said, I recommend partitioning your data
and loading each segment from its own thread, using a shared data streamer
instance.
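To illustrate the pattern of "one shared sink, one loader thread per
segment", here is a minimal JDK-only sketch. Since Ignite cannot run inside a
self-contained snippet, a thread-safe `ConcurrentHashMap` stands in for the
shared `IgniteDataStreamer` (which is likewise safe to share across threads);
the real `streamer.addData(key, value)` call is noted in a comment, and the
class and method names here are illustrative, not from the thread.

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelLoadSketch {

    // Partition the key range [0, total) into `threads` segments and load
    // each segment from its own thread into one shared sink. This mirrors
    // sharing a single IgniteDataStreamer instance across loader threads.
    static ConcurrentMap<Integer, String> load(int total, int threads)
            throws InterruptedException {
        ConcurrentMap<Integer, String> sink = new ConcurrentHashMap<>();
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        int chunk = total / threads;
        for (int t = 0; t < threads; t++) {
            int from = t * chunk;
            // Last thread picks up any remainder.
            int to = (t == threads - 1) ? total : from + chunk;
            pool.submit(() -> {
                for (int k = from; k < to; k++) {
                    // With Ignite this would be: streamer.addData(key, value);
                    sink.put(k, "value-" + k);
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        return sink;
    }

    public static void main(String[] args) throws Exception {
        // All 20,000 records end up in the shared sink.
        System.out.println(load(20_000, 4).size()); // prints 20000
    }
}
```

The key point is that the partitioning happens on the producer side; the
streamer (here, the map) does not need any per-thread state.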

Regards,
-- 
Ilya Kasnacheev


On Tue, Feb 18, 2020 at 20:14, nithin91 <
[email protected]>:

> Hi
>
> We are doing a POC, which is why we are running it in local mode.
>
> Currently it is taking 25 minutes to load 20,000 records with the Cache
> JDBC POJO Store.
>
> I am even applying an initial filter to reduce unnecessary records.
>
>
>
>
> ignite.cache("PieProductRiskCache").loadCache(null,
>     "ignite.example.IgniteUnixImplementation.PieProductRiskKey",
>     "select * from Table where as_of_Date_Std='31-Dec-2019'");
>
> Regarding the Data Streamer code I have shared, is that the way we
> implement Data Streamers, or is there another way of implementing them? If
> the approach is correct, then will it not work, given that we are looping
> through the result set?
>
>
>
> --
> Sent from: http://apache-ignite-users.70518.x6.nabble.com/
>
