Hi!

If each row is stored as an entry in the cache, you can expect an overhead of around 200 bytes per entry, so about 200 MB just for the entry overhead of 1 million entries, not counting your actual data (more if you have any indexes).
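The overhead estimate above can be sketched as a quick back-of-envelope calculation; the 200-bytes-per-entry figure is the rough number mentioned above, not a measured value:

```java
// Back-of-envelope estimate of per-entry cache overhead.
// The ~200 bytes/entry figure is a rough Ignite estimate, not measured.
public class EntryOverheadEstimate {
    public static void main(String[] args) {
        long entries = 1_000_000L;      // rows loaded so far
        long overheadPerEntry = 200L;   // bytes of bookkeeping per cache entry (assumption)

        long overheadTotal = entries * overheadPerEntry;

        // Decimal megabytes, matching the "200 MB" figure in the text.
        System.out.println(overheadTotal / 1_000_000 + " MB overhead for "
                + entries + " entries, before any actual data or indexes");
    }
}
```

For the full 50 million rows the same arithmetic gives roughly 10 GB of overhead alone, which is worth keeping in mind when sizing the data nodes.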

You can control the streamer: how much data it buffers and when it should be flushed. I have no idea how this works in the .NET client, though, so maybe the problem is there. You could try manually calling flush on the streamer at intervals (this shouldn't be needed, but just to see if it makes any difference). I use a lot of streamers (from Java) and have never had any problems with them, so maybe it is something on the .NET side.
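As a rough illustration, here is a minimal Java sketch of the buffering knobs and the manual flush suggested above. The cache name "csvCache", the buffer size, and the flush interval are all made up for the example; it assumes a running Ignite cluster with that cache already created:

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;

public class StreamerFlushSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start();
             IgniteDataStreamer<Long, String> streamer = ignite.dataStreamer("csvCache")) {

            streamer.perNodeBufferSize(512);    // entries buffered per node before sending
            streamer.autoFlushFrequency(1_000); // also flush automatically every second

            for (long i = 0; i < 1_000_000; i++) {
                streamer.addData(i, "row-" + i); // stand-in for a parsed CSV row

                // Manual flush at intervals -- not normally needed, but useful
                // to see whether it changes the client's memory behaviour.
                if (i % 100_000 == 0)
                    streamer.flush();
            }
        } // try-with-resources closes the streamer, which flushes remaining data
    }
}
```

The .NET `IDataStreamer` exposes equivalent knobs, so the same experiment should be possible from the C# side.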

Mikael

Den 2019-11-14 kl. 12:14, skrev Pavel Tupitsyn:
Sounds nasty, can you share a reproducer please?

On Thu, Nov 14, 2019 at 10:12 AM camer314 <[email protected]> wrote:

    I have a large CSV file (50 million rows) that I wish to upload to a
    cache. I am using .NET and a DataStreamer from my application, which
    is designated as a client-only node.

    What I don't understand is that I quickly run out of memory in my C#
    streaming (client) application, while my data node (an instance of
    Apache.Ignite.exe) slowly increases RAM usage, but not at the rate my
    client app does.

    So it would seem that either (A) my client IS actually being used to
    cache data, or (B) there is a memory leak where data that has been
    sent to the cache is not released.

    As for figures, Apache.Ignite.exe uses 165 MB when first started.
    After loading in 1 million records and letting it all settle down,
    Apache.Ignite.exe now sits at 450 MB, while my client app (the one
    streaming) sits at 1.5 GB.

    The total size of the input file is 5 GB, so 1 million records should
    really only be about 100 MB, so I don't know how my client even gets
    to 1.5 GB to begin with. If I comment out the AddData() call, my
    client never gets past 200 MB, so it's certainly something happening
    in the cache.

    Is this expected behaviour? If so, I don't know how to import huge
    CSV files without memory issues on the streaming machine.

    --
    Sent from: http://apache-ignite-users.70518.x6.nabble.com/
