It's not obvious to me from the code what the problem would be. Have
you tried using a profiler to see which kinds of objects account for
the memory usage? When I've run into seq-retention bugs in the past, that
was pretty helpful in figuring out the underlying problem.
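
For what it's worth, a heap histogram is often the quickest check: jmap -histo <pid>
from the JDK, or the memory view in VisualVM, will tell you whether realized seq cells
(clojure.lang.LazySeq, clojure.lang.Cons) or plain Strings are what's piling up. Just as
a hypothetical illustration (none of this is taken from your gist, and the file name is a
placeholder), the classic head-retention pattern that shows up that way looks like this:

(require '[clojure.java.io :as io])

;; Leaky: `lines` keeps naming the head while `count` realizes the
;; whole seq, so every line read so far stays reachable.
(with-open [rdr (io/reader "big.csv")]
  (let [lines (line-seq rdr)]
    (println (count lines))
    (println (first lines))))

;; Head-safe: consume the seq in one pass without keeping a reference
;; to its head.
(with-open [rdr (io/reader "big.csv")]
  (doseq [line (line-seq rdr)]
    (println (count line))))   ; placeholder per-line work

If the histogram is instead dominated by something unexpected (char[] buffers, objects
from the CSV parsing, etc.), that narrows things down just as well.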

On Feb 27, 8:13 pm, Sunil S Nandihalli <sunil.nandiha...@gmail.com>
wrote:
> Hi Everybody,
>  I am using lazy-seqs to join two very large CSV files. I am very certain
> that I am not holding on to any of the heads, and if I were, the JVM would
> run out of memory far sooner than it does currently. The file is something
> like 73 GB and the RAM allocated to the JVM is about 8 GB. It seems like a
> very gradual leak. Has anybody else encountered similar problems? In case
> some of you feel that my code might be the culprit, the following gist has
> the source.
>
> https://gist.github.com/1929345
>
> Thanks,
> Sunil.
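
In case it helps to compare against what's in the gist, below is roughly the shape I'd
expect a head-safe streaming join to take. This is only a sketch under assumptions I'm
making up (both files pre-sorted on unique join keys, join key in the first column, no
embedded commas, placeholder file names), not a claim about what your code does:

(require '[clojure.java.io :as io]
         '[clojure.string :as str])

(defn merge-join
  "Lazily merge-join two seqs of [key fields] pairs, each sorted by key.
   Emits [key left-fields right-fields] for keys present in both."
  [xs ys]
  (lazy-seq
    (when (and (seq xs) (seq ys))
      (let [[kx vx] (first xs)
            [ky vy] (first ys)
            c       (compare kx ky)]
        (cond
          (neg? c) (merge-join (rest xs) ys)
          (pos? c) (merge-join xs (rest ys))
          :else    (cons [kx vx vy]
                         (merge-join (rest xs) (rest ys))))))))

(defn keyed-lines
  "Lazily turn a reader's lines into [key fields] pairs (naive comma split)."
  [rdr]
  (map (fn [line]
         (let [fields (str/split line #",")]
           [(first fields) (rest fields)]))
       (line-seq rdr)))

(with-open [a   (io/reader "left.csv")
            b   (io/reader "right.csv")
            out (io/writer "joined.csv")]
  (doseq [[k l r] (merge-join (keyed-lines a) (keyed-lines b))]
    (.write out (str (str/join "," (concat [k] l r)) "\n"))))

Since doseq doesn't hold the head of the seq it walks, memory use here should stay flat
regardless of file size; if yours still climbs, the heap histogram should say exactly
what is accumulating.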
