I have a large CSV file (50 million rows) that I wish to upload to a cache. I
am using .NET and a data streamer (IDataStreamer) from my application, which
is designated as a client-only node.
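
For reference, this is roughly how the node and the streamer are set up. This
is a simplified sketch: the "csvCache" name and the long/string key-value
types are placeholders, not my real schema.

using System;
using Apache.Ignite.Core;
using Apache.Ignite.Core.Datastream;

class CsvLoader
{
    static void Main()
    {
        // Start this process as a client-only node, so it should not
        // store any cache data itself.
        var cfg = new IgniteConfiguration { ClientMode = true };

        using (IIgnite ignite = Ignition.Start(cfg))
        {
            // Placeholder cache name and key/value types.
            ignite.GetOrCreateCache<long, string>("csvCache");

            using (IDataStreamer<long, string> streamer =
                       ignite.GetDataStreamer<long, string>("csvCache"))
            {
                // AddData() loop goes here (shown further down).
            }   // Dispose() flushes remaining buffered entries and closes the streamer.
        }
    }
}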

What I don't understand is that I quickly run out of memory in my C# streaming
(client) application, while my data node (an instance of Apache.Ignite.exe)
slowly increases its RAM usage, but nowhere near at the rate my client app does.

So it would seem that either (A) my client IS actually being used to cache
data, or (B) there is a memory leak where data that has already been sent to
the cache is not released on the client.

As for figures: Apache.Ignite.exe uses 165 MB when first started. After
loading 1 million records and letting everything settle down,
Apache.Ignite.exe now sits at 450 MB, while my client app (the one doing the
streaming) sits at 1.5 GB.

The total size of the input file is 5 GB, so 1 million records should really
only be about 100 MB, and I don't see how my client even gets to 1.5 GB to
begin with. If I comment out the AddData() call, my client never gets past
200 MB, so it is certainly something that AddData() is doing.
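
For completeness, the loop that feeds the streamer is roughly the following
(a simplified sketch that sits inside the using block above; File.ReadLines
is from System.IO, and the one-key-one-string column layout is a placeholder,
not my real parsing):

// File.ReadLines() streams the file lazily, so the CSV itself is never
// fully held in memory by this loop.
foreach (string line in File.ReadLines(@"C:\data\rows.csv"))
{
    // Placeholder parsing: first column is the key, the rest is the value.
    int comma = line.IndexOf(',');
    long key = long.Parse(line.Substring(0, comma));
    string value = line.Substring(comma + 1);

    // This is the call that, when commented out, keeps the client under 200 MB.
    streamer.AddData(key, value);
}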

Is this expected behaviour? If so, I don't know how to import huge CSV files
without running into memory issues on the streaming machine.
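
The only knobs I can see on the streamer are its buffer settings and manual
Flush(); a sketch of using them is below (the values are guesses, and I don't
know whether these are even the right levers):

// Try to bound how much the streamer buffers on the client side
// (values are guesses, not recommendations).
streamer.PerNodeBufferSize = 1024;        // entries buffered per data node before a batch is sent
streamer.PerNodeParallelOperations = 4;   // concurrent batches in flight per data node

long n = 0;
foreach (string line in File.ReadLines(@"C:\data\rows.csv"))
{
    int comma = line.IndexOf(',');
    streamer.AddData(long.Parse(line.Substring(0, comma)), line.Substring(comma + 1));

    // Periodically block until everything buffered so far has reached the cache.
    if (++n % 100000 == 0)
        streamer.Flush();
}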




