Yes, the memory is eventually getting garbage collected, but because I am
sometimes working on files larger than 100 GB, the process runs out of memory
first. I thought reading one line at a time would keep memory usage flat while
parsing such big files.
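
One workaround on my side might be to force a collection every so often, so the
per-line garbage is reclaimed before it piles up. A minimal sketch, assuming
gc() forces a full collection as it does in the Julia I am running (the
100000-line interval is arbitrary):

open("large_file.txt") do fh
    i = 0
    for line in eachline(fh)
        println(length(line))
        i += 1
        if i % 100000 == 0
            gc()   # force a collection so the accumulated line Strings are freed
        end
    end
end

A second sketch that avoids the per-line allocation entirely is after the
quoted thread below.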



On Mon, Mar 31, 2014 at 7:31 AM, Stefan Karpinski <[email protected]> wrote:

> Each line String is newly allocated, so there is garbage produced on each
> iteration of this loop. There are some tricks that could potentially avoid
> this, but we haven't implemented them yet. Is the memory eventually getting
> garbage collected or no?
>
>
> On Sun, Mar 30, 2014 at 2:51 PM, km <[email protected]> wrote:
>
>> Dear All,
>>
>> I am reading a large file (10Gb) as follows
>> open("large_file.txt") do fh
>>     for line in eachline(fh)
>>         println(length(line))
>>     end
>> end
>>
>> It is strange that the memory consumption goes up linearly with time. I
>> would expect it to stay small and roughly constant, since we are reading
>> only one line at a time.
>>
>> Please let me know.
>>
>> Am I missing something?
>>
>> Regards,
>> Krishna
>>
>>
>>
>
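
For reference, a rough sketch of one way to sidestep the per-line String
allocation Stefan describes: count line lengths directly from the raw bytes,
so no String is ever created. This assumes counting lengths is all that is
needed, and spells the byte type Uint8 as current Julia does (later releases
rename it UInt8):

open("large_file.txt") do fh
    len = 0
    while !eof(fh)
        c = read(fh, Uint8)    # one byte at a time; no per-line String
        if c == 0x0a           # '\n' marks the end of a line
            println(len)       # len counts bytes, not characters
            len = 0
        else
            len += 1
        end
    end
end

Reading one byte at a time is slow for a file this size, but memory stays flat
because nothing is allocated per line.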
