> Since we're in 2019, we don't recommend running any Ignite nodes with
> less than -Xmx2G (that is, 2 gigabytes of heap allowance).

Does 2019 somehow allow us to consume 2 GB for nothing?
I don't think a client node needs that much.

Let's see a reproducer.
My testing shows that streaming works out of the box on a client node; no
custom JVM tuning or anything else is required.
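For reference, a minimal reproducer along these lines might look like the
sketch below (in Java; the thread's app is .NET, but the streamer API is the
same shape). It assumes a server node is already running and reachable via
default discovery; the cache name and row format are made up for illustration.

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;
import org.apache.ignite.configuration.IgniteConfiguration;

public class StreamerReproducer {
    public static void main(String[] args) {
        // Start this node in client mode: it should not store cache data itself.
        IgniteConfiguration cfg = new IgniteConfiguration().setClientMode(true);

        try (Ignite ignite = Ignition.start(cfg)) {
            ignite.getOrCreateCache("csvCache"); // hypothetical cache name

            try (IgniteDataStreamer<Long, String> streamer =
                     ignite.dataStreamer("csvCache")) {
                // Stream rows; the streamer batches them into per-node buffers
                // and ships them to server nodes, so client heap should stay flat.
                for (long i = 0; i < 1_000_000; i++)
                    streamer.addData(i, "row-" + i);
            } // close() flushes any remaining buffered entries
        }
    }
}
```

If the client heap still grows without bound in a reproducer like this, the
streamer's per-node buffers (tunable via `perNodeBufferSize`) would be the
first thing to inspect.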

On Thu, Nov 14, 2019 at 4:12 PM Ilya Kasnacheev <[email protected]>
wrote:

> Hello!
>
> Since we're in 2019, we don't recommend running any Ignite nodes with
> less than -Xmx2G (that is, 2 gigabytes of heap allowance).
>
> It is certainly possible to run Ignite with less heap, but the reasoning
> for doing so is not very clear.
>
> Please also note that our JDBC thin driver supports streaming, and it
> should be usable from .NET in some way. In this case, memory overhead is
> supposed to be small.
>
> Regards,
> --
> Ilya Kasnacheev
>
>
> Thu, Nov 14, 2019 at 10:12, camer314 <[email protected]>:
>
>> I have a large CSV file (50 million rows) that I wish to upload to a
>> cache. I am using .NET and a DataStreamer from my application, which is
>> designated as a client-only node.
>>
>> What I don't understand is that I quickly run out of memory in my C#
>> streaming (client) application, while my data node (an instance of
>> Apache.Ignite.exe) slowly increases RAM usage, but not at the same rate as
>> my client app does.
>>
>> So it would seem that either (A) my client IS actually being used to cache
>> data, or (B) there is a memory leak where data that has been sent to the
>> cache is not released.
>>
>> As for figures, Apache.Ignite.exe uses 165 MB when first started. After
>> loading in 1 million records and letting it all settle down,
>> Apache.Ignite.exe now sits at 450 MB, while my client app (the one
>> streaming) sits at 1.5 GB.
>>
>> The total size of the input file is 5 GB, so 1 million records should
>> really only be about 100 MB. I don't know how my client even gets to
>> 1.5 GB to begin with. If I comment out AddData(), my client never gets
>> past 200 MB, so it's certainly something happening in the cache.
>>
>> Is this expected behaviour? If so, I don't know how to import huge CSV
>> files without memory issues on the streaming machine.
>>
>>
>> --
>> Sent from: http://apache-ignite-users.70518.x6.nabble.com/
>>
>
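The JDBC thin-driver streaming Ilya mentions can be sketched roughly as
follows. This is a hedged example, not the thread's actual code: it assumes
an Ignite node listening on the default thin-client port on localhost and a
hypothetical SQL table named City.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

public class JdbcStreamingSketch {
    public static void main(String[] args) throws Exception {
        try (Connection conn =
                 DriverManager.getConnection("jdbc:ignite:thin://127.0.0.1")) {
            try (Statement s = conn.createStatement()) {
                // Switch the connection into streaming mode: subsequent
                // INSERTs are batched and fed through a data streamer.
                s.execute("SET STREAMING ON");
            }

            try (PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO City (id, name) VALUES (?, ?)")) {
                for (int i = 0; i < 1_000_000; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "city-" + i); // made-up sample data
                    ps.executeUpdate();
                }
            }

            try (Statement s = conn.createStatement()) {
                // Turning streaming off flushes any rows still buffered.
                s.execute("SET STREAMING OFF");
            }
        }
    }
}
```

Because the thin driver only batches rows in flight rather than caching them
locally, the client-side memory footprint of an import like this should stay
small, which is the point being made above.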
