Yes, I run out of memory and the process gets killed. I am working on files
200 GB in size. I hoped to parse one line at a time and save memory.
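
For what it's worth, here is the kind of stopgap I may try in the meantime --
just a sketch, with a placeholder filename and an arbitrary 100000-line
interval, and it would only keep the garbage from piling up between
collections rather than remove the per-line allocation:

open("large_file.txt") do fh
    i = 0
    for line in eachline(fh)
        println(length(line))
        # each line is a freshly allocated String; trigger a collection
        # periodically so dead strings are reclaimed before memory blows up
        i += 1
        if i % 100000 == 0
            gc()
        end
    end
end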


On Mon, Mar 31, 2014 at 10:57 AM, Stefan Karpinski <
[email protected]> wrote:

> This isn't particularly well optimized at this point, but it's slated to be
> addressed very soon. Do you actually run out of memory, or does the usage
> just grow faster than you'd like?
>
> On Mar 31, 2014, at 1:07 AM, km <[email protected]> wrote:
>
> Yes, the memory does eventually get garbage collected, but because I am
> sometimes working on >100 GB files, it still runs out of memory. I thought
> reading a line at a time would let me parse such big files.
>
>
>
> On Mon, Mar 31, 2014 at 7:31 AM, Stefan Karpinski <[email protected]> wrote:
>
>> Each line String is newly allocated, so there is garbage produced on each
>> iteration of this loop. There are some tricks that could potentially avoid
>> this, but we haven't implemented them yet. Is the memory eventually getting
>> garbage collected or no?
>>
>>
>> On Sun, Mar 30, 2014 at 2:51 PM, km <[email protected]> wrote:
>>
>>> Dear All,
>>>
>>> I am reading a large file (10 GB) as follows:
>>> open("large_file.txt") do fh
>>>     for line in eachline(fh)
>>>         println(length(line))
>>>     end
>>> end
>>>
>>> It is strange to note that the memory consumption goes up linearly with
>>> time. I would expect it to be negligible and constant, because we are
>>> reading only one line at a time.
>>>
>>> Please let me know.
>>>
>>> Am I missing something?
>>>
>>> Regards,
>>> Krishna
>>>
>>>
>>>
>>
>
