Right, that'll work, but it's no longer lazy in the sense that it
will read the whole sequence into memory (a problem for me because my
sequences are tens of GB long, compressed).

The feature I was trying to show is that the "yield" function lets
you make *arbitrary* non-lazy code lazy (not just for cleanup, but
for anything).

In this particular case, the producer thread will only read 1000
objects ahead before blocking (inside the yield function) and waiting
for the consumers to catch up, so it won't blow up memory.
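The backpressure described above can be sketched with a plain bounded queue. This is not clj-yield's actual implementation, just an illustrative Java sketch: a hypothetical producer that blocks once it gets 1000 items ahead of the consumer, so memory stays bounded no matter how large the input is.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BoundedProducer {
    public static void main(String[] args) throws InterruptedException {
        // Hypothetical stand-in for with-yielding's buffer: capacity 1000
        // means the producer blocks once it is 1000 items ahead.
        BlockingQueue<Integer> buffer = new ArrayBlockingQueue<>(1000);

        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < 5000; i++) {
                    buffer.put(i); // blocks while the buffer is full
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();

        // Consumer drains the buffer; at no point are more than
        // ~1000 items held in memory at once.
        long sum = 0;
        for (int i = 0; i < 5000; i++) {
            sum += buffer.take();
        }
        producer.join();
        System.out.println(sum); // sum of 0..4999 = 12497500
    }
}
```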

Also, in this example, with-open doesn't really work for producing
lazy sequences. As you pointed out, you must read the whole file to
avoid closing it too early. With 'yield', however, it works, because
the body of the with-yielding runs on its own thread; the body of the
with-open doesn't finish until the file is completely read.
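The point about the body having its own thread can be shown with a small Java sketch (the "resource" here is just a flag standing in for the file handle, not clj-yield's real machinery): the producer thread owns the resource, closes it when it reaches the end of input, and the consumer can still drain the buffered items afterwards.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.atomic.AtomicBoolean;

public class ProducerOwnsResource {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> out = new ArrayBlockingQueue<>(4);
        AtomicBoolean closed = new AtomicBoolean(false); // stands in for the file handle

        // The "body" runs on its own thread, like with-yielding's body:
        // it yields items and closes the resource at EOF, regardless of
        // how far behind the consumer is.
        Thread body = new Thread(() -> {
            try {
                for (int i = 0; i < 3; i++) {
                    out.put("line-" + i);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            } finally {
                closed.set(true); // like with-open closing the stream
            }
        });
        body.start();
        body.join(); // producer finished: resource is already closed

        // The consumer can still read the buffered items afterwards.
        System.out.println(closed.get()); // true
        System.out.println(out.take());   // line-0
    }
}
```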


On Aug 4, 10:56 am, Cameron <cpuls...@gmail.com> wrote:
> Not 100% on this, but this is what I do when reading files...
>
> (with-open [rdr (BufferedReader. (FileReader. file-name))]
>     (reduce conj [] (line-seq rdr)))
>
> That ensures that the whole seq is realized without closing the
> handle, but it also allows you to wrap the whole block with a take
> function if you only cared about the first few lines. As far as I
> know, this would still close the resources afterwards, whether you realize
> the whole sequence or only take part of it. Can someone who knows a
> bit better confirm?
>
> On Aug 3, 5:28 pm, Jeff Palmucci <jpalmu...@gmail.com> wrote:
>
>
>
> > See my library at http://github.com/jpalmucci/clj-yield, which makes
> > this trivial.
>
> > For example, here is a function I use to read a sequence of java
> > serialized objects from a stream:
>
> > (defn read-objects [path]
> >  (with-yielding [out 1000]
> >    (with-open [stream (java.io.ObjectInputStream.
> >                        (java.io.BufferedInputStream.
> >                         (java.util.zip.GZIPInputStream.
> >                          (java.io.FileInputStream. (io/file path)))))]
> >      (loop []
> >        (try
> >         (let [next (.readObject stream)]
> >           (yield out next)
> >           (recur))
> >         (catch java.io.EOFException e (.close stream)))))))
>
> > When the sequence returned by with-yielding becomes garbage
> > collectable, yield will throw an exception causing with-open to close
> > the file.
>
> > Note that with-yielding will use a thread from the thread pool, so it's
> > a bad idea to have hundreds of active with-yieldings at once.
>
> > On Aug 3, 3:21 pm, David Andrews <dammi...@gmail.com> wrote:
>
> > > I want to create a lazy seq backed by an open file (or db connection,
> > > or something else that needs cleanup).  I can't wrap the consumer in a
> > > with-anything.
>
> > > Is there a general method for cleaning up after the consumer discards
> > > its reference to that lazy seq?  I'm vaguely aware of Java finalize,
> > > but am also aware that it is unpredictable (e.g. you aren't guaranteed
> > > to be driven at the next gc).
>
> > > Does Clojure even provide the ability to define a finalize method for
> > > a lazy seq?
>
> > > (Sipping water from a firehose...)

-- 
You received this message because you are subscribed to the Google
Groups "Clojure" group.