Here is what I tried:
https://gist.github.com/ptupitsyn/7dacefd1cebb936d5f516d8afeba7efe

It ran for a minute or so with about 200 MB used on the client and 5 GB on
the server; that seems like expected behaviour to me.
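The pattern that keeps client memory bounded is to read the CSV row by row and let the streamer hold only a fixed-size batch at a time, flushing each batch before reading further, instead of retaining every row (or every Task returned by AddData). Below is a minimal, language-agnostic sketch of that bounded-buffer idea in Python; `BatchingStreamer` and its flush callback are hypothetical stand-ins for Ignite's DataStreamer, not its real API:

```python
import csv
import io

# Hypothetical stand-in for a data streamer: it buffers entries and hands
# each full batch to a flush callback, so at most `batch_size` rows are
# held in client memory at any time.
class BatchingStreamer:
    def __init__(self, flush_cb, batch_size=512):
        self.flush_cb = flush_cb
        self.batch_size = batch_size
        self.buffer = []
        self.flushed = 0  # total rows pushed out so far

    def add_data(self, key, value):
        self.buffer.append((key, value))
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.flush_cb(self.buffer)
            self.flushed += len(self.buffer)
            self.buffer = []  # drop the reference so flushed rows can be GC'd

def stream_csv(text, streamer):
    # Stream rows one at a time instead of reading the whole file into memory.
    for i, row in enumerate(csv.reader(io.StringIO(text))):
        streamer.add_data(i, row)
    streamer.flush()  # push any partial final batch

batches = []
s = BatchingStreamer(batches.append, batch_size=2)
stream_csv("a,1\nb,2\nc,3\n", s)
# 3 rows with batch_size=2 -> two flushes: one of 2 rows, one of 1 row
```

The same principle applies in the .NET client: let the streamer's own per-node buffer do the batching, and avoid accumulating references (rows, tasks, strings) on the client side between flushes.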

On Thu, Nov 14, 2019 at 2:14 PM Pavel Tupitsyn <[email protected]> wrote:

> Sounds nasty; can you share a reproducer, please?
>
> On Thu, Nov 14, 2019 at 10:12 AM camer314 <[email protected]>
> wrote:
>
>> I have a large CSV file (50 million rows) that I wish to upload to a
>> cache. I am using .NET and a DataStreamer from my application, which is
>> running as a client-only node.
>>
>> What I don't understand is that I quickly run out of memory in my C#
>> streaming (client) application, while my data node (an instance of
>> Apache.Ignite.exe) increases its RAM usage slowly, nowhere near the rate
>> my client app does.
>>
>> So it would seem that either (a) my client IS actually being used to
>> cache data, or (b) there is a memory leak where data that has been sent
>> to the cache is not released.
>>
>> As for figures: Apache.Ignite.exe uses 165 MB when first started. After
>> loading 1 million records and letting everything settle down,
>> Apache.Ignite.exe sits at 450 MB, while my client app (the one streaming)
>> sits at 1.5 GB.
>>
>> The total size of the input file is 5 GB, so 1 million records should
>> only be about 100 MB; I don't know how my client even gets to 1.5 GB to
>> begin with. If I comment out the AddData() call, my client never gets
>> past 200 MB, so it is certainly something happening on the cache side.
>>
>> Is this expected behaviour? If so, I don't know how to import huge CSV
>> files without memory issues on the streaming machine.
>>
>> --
>> Sent from: http://apache-ignite-users.70518.x6.nabble.com/
>>
>
