Thanks @Paulus, @Gary and @Peter,
Rearranging the process to let go of the head is good advice.
I believe the problem (if I do need to keep all elements in memory) may
ultimately be the lazy collections inside the maps I'm producing.
I saved 1,917 of these elements to disk and it took only 3
For background on "holding onto the head of a sequence" type problems, see
https://stuartsierra.com/2015/04/26/clojure-donts-concat
and
https://stackoverflow.com/questions/15994316/clojure-head-retention
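For anyone skimming the thread, the pattern in miniature (a sketch, not the code from this thread): a top-level def keeps the head of a lazy sequence reachable, so traversing it materializes every element at once; without the named reference, the GC frees elements behind the traversal.

```clojure
;; Head retained: the Var `retained` keeps the first cell of the lazy
;; seq reachable, so forcing it with `last` would hold all 10^8
;; elements in memory at once.
(def retained (range 1e8))
;; (last retained)            ; risks "GC overhead limit exceeded"

;; Head released: nothing names the front of the seq, so each chunk
;; becomes garbage as soon as `last` walks past it.
(last (range 1e8))
;; => 99999999
```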
On Tue, Aug 8, 2017 at 6:19 PM, Nathan Smutz wrote:
> The one thing I'm
@Nathan the top-level (def requirement-seq ..) is probably the thing
holding on to all the objects. Try removing the def and calling (last
(sequence (comp ..))) and see if it returns. The purpose of a lazy
sequence is to allow processing to happen one item or chunk at a time, if
there are still
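A minimal sketch of that suggestion, with a hypothetical transducer stack `xform` and input `inputs` standing in for the real pipeline:

```clojure
;; Hypothetical stand-ins for the real pipeline:
(def xform (comp (map inc) (filter even?)))
(def inputs (range 100))

;; Holding the result in a top-level def pins the head, so every
;; realized element stays reachable through the Var:
;;   (def requirement-seq (sequence xform inputs))
;;   (last requirement-seq)

;; Without the def, elements are freed as `last` moves past them:
(last (sequence xform inputs))
;; => 100
```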
The one thing I'm aware of holding on to is a filtered file-seq:
(def the-files
  (filter #(s/ends-with? (.getName %) ".xml")
          (rest (file-seq (io/file dw-path)))))
There are 7,000+ files, but I'm assuming the elements there are just
file references and shouldn't take much space.
The rest of the
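If the per-file results are small, one way to guarantee nothing else is retained is to drive the files with doseq and keep only what each step produces (a sketch; `process-xml` is a hypothetical stand-in for the zipper work, which would write its result to disk or return a small map):

```clojure
(require '[clojure.java.io :as io])

;; Hypothetical stand-in for the real per-file zipper processing:
(defn process-xml [in]
  (slurp in))

;; Each file is opened, processed, and dropped before the next
;; iteration; neither the stream nor the parsed data outlives it.
(defn process-files [files]
  (doseq [f files]
    (with-open [in (io/input-stream f)]
      (process-xml in))))

;; (process-files the-files)  ; `the-files` from the def above
```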
On Tuesday, 8 August 2017 06:20:56 UTC+1, Nathan Smutz wrote:
> Does this message sometimes present because the non-garbage data is
> getting too big?
>
Yes, it's when most of your heap is non-garbage, so the GC has to keep
running but doesn't succeed in freeing much memory each time.
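For reference, the limit is tunable with standard JVM flags (a sketch; whether raising them helps depends on whether the data is actually garbage):

```shell
# A sketch of the standard knobs (my.jar / my.core are placeholders):
# -Xmx raises the heap ceiling; -XX:-UseGCOverheadLimit disables the
# overhead check, which usually just converts this error into a plain
# OutOfMemoryError later if the data really is live.
java -Xmx4g -XX:-UseGCOverheadLimit -cp my.jar clojure.main -m my.core

# The Leiningen equivalent goes in project.clj:
#   :jvm-opts ["-Xmx4g" "-XX:-UseGCOverheadLimit"]
```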
See
In the course of processing thousands of XML files (maximum size 388 KB,
but I am doing a lot of operations with zippers) I got this message:
OutOfMemoryError GC overhead limit exceeded
com.sun.org.apache.xerces.internal.xni.XMLString.toString
I can process about 2,100 before that pops up. I
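One shape that keeps the working set to a single document at a time (a sketch using the built-in clojure.xml and clojure.zip; `summarize` is a hypothetical stand-in for the real zipper logic, which should return only the small result you need):

```clojure
(require '[clojure.java.io :as io]
         '[clojure.xml :as xml]
         '[clojure.zip :as zip])

;; Hypothetical stand-in for the real zipper work: reduce the whole
;; tree to a small map so the tree itself can be collected.
(defn summarize [loc]
  {:root-tag (:tag (zip/node loc))})

(defn process-file [f]
  (with-open [in (io/input-stream f)]
    (-> (xml/parse in)      ; parse one document
        zip/xml-zip         ; zipper over it
        summarize)))        ; keep only the summary map

;; mapv is eager, so it realizes only the summary maps -- each parsed
;; tree becomes garbage before the next file is opened.
(defn process-all [files]
  (mapv process-file files))
```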