On Wednesday, November 7, 2012 2:29:06 PM UTC-5, Jim foo.bar wrote:
>
> This is exactly the approach I'm taking...'doall' retains the head so
> with massive files it will break...'doseq' will not. at least this is my
> understanding...
>
That is correct. `doall` retains the head because it returns the fully realized sequence, while `doseq` is evaluated purely for side effects and returns nil, so already-processed lines can be garbage-collected as you go.
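A minimal sketch of the two idioms side by side (my illustration, not from the thread; the function names and file path are made up):

```clojure
(require '[clojure.java.io :as io])

;; Retains the whole file: doall realizes the seq inside with-open
;; and returns it, so every line stays reachable from the head.
(defn all-lines [file-name]
  (with-open [r (io/reader file-name)]
    (doall (line-seq r))))

;; Constant memory: doseq walks the seq for side effects and
;; returns nil, so each line can be collected after it is used.
(defn count-lines [file-name]
  (with-open [r (io/reader file-name)]
    (let [n (atom 0)]
      (doseq [_ (line-seq r)]
        (swap! n inc))
      @n)))
```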
This is exactly the approach I'm taking...'doall' retains the head so
with massive files it will break...'doseq' will not. at least this is my
understanding...
Jim
On 07/11/12 19:25, Sean Corfield wrote:
I suspect it's considered more idiomatic to do:
(defn process-records [process file-nam
On Wed, Nov 7, 2012 at 11:09 AM, Dave Ray wrote:
> (defn get-records [file-name]
>   (with-open [r (reader file-name)]
>     (line-seq r)))
I suspect it's considered more idiomatic to do:
(defn process-records [process file-name]
  (with-open [r (reader file-name)]
    (doseq [line (line-seq r)]
      (process line))))
aaa ok this is clearer now...I guess I never wanted/tried to do that -
it seems plain weird to me since I know that there is a try/finally
hidden in there and escaping its scope will render the stream useless by
closing it. anyway thanks for the quick and clear response :)
If you're dealing a
Hi Dave,
On 07.11.2012 at 20:09, Dave Ray wrote:
> There aren't any problems with with-open/doseq/line-seq. The issue is
> with with-open/line-seq. For example, it's instinctive (at least for
> me anyway) to want to write a function like this:
>
> (defn get-records [file-name]
> (with-open [r
There aren't any problems with with-open/doseq/line-seq. The issue is
with with-open/line-seq. For example, it's instinctive (at least for
me anyway) to want to write a function like this:
(defn get-records [file-name]
  (with-open [r (reader file-name)]
    (line-seq r)))
Of course, the problem is that with-open closes the reader as soon as the function returns, so the lazy sequence it hands back can no longer be realized.
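A quick illustration of that failure (my sketch; the file name is arbitrary). Note that `line-seq` reads the first line eagerly inside `with-open`, so it is realizing the sequence *past* the first line that throws:

```clojure
(require '[clojure.java.io :as io])

(defn get-records [file-name]
  (with-open [r (io/reader file-name)]
    (line-seq r)))

;; with-open closes the reader before the caller forces the seq,
;; so realizing it beyond the eagerly-read first line throws
;; java.io.IOException: Stream closed.
(defn records-fail? [file-name]
  (try
    (doall (get-records file-name))
    false
    (catch java.io.IOException _ true)))
```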
I know I'm coming a bit late to this thread but I did not have the
chance to reply earlier...
Can somebody briefly explain what the problem is with the combination
of with-open/doseq/line-seq? In a project of mine I'm dealing with files
larger than 200MB (which of course will not even open o
Stuart,
Thanks for the link. It confirms the suspicions I had about a general
solution for this issue. For the particular code I'm working with,
I'll try pushing with-open further up and see if that gives me some of
the flexibility I'm looking for.
Cheers,
Dave
On Sun, Oct 28, 2012 at 2:21 PM,
On Friday, October 26, 2012 11:11:48 PM UTC-4, daveray wrote:
> I guess I'm looking for a magical line-seq that closes the file correctly
> even if you consume part of the sequence, is resilient to exceptions,
> etc, etc. I realize that it might be impossible, so I asked. :)
>
It's been discussed
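For what it's worth, one sketch of such a "magical" line source (my own illustration, not something proposed in the thread): make the file reducible rather than seqable, so every reduction opens its own reader and `with-open` guarantees it is closed, even on early termination or an exception:

```clojure
(require '[clojure.java.io :as io])

;; Reducible view of a file's lines. Each reduce opens a fresh
;; reader; with-open closes it whether the reduction finishes,
;; short-circuits via reduced, or throws. No line is retained.
(defn lines-reducible [file-name]
  (reify clojure.lang.IReduceInit
    (reduce [_ f init]
      (with-open [r (io/reader file-name)]
        (loop [acc init]
          (if-let [line (.readLine r)]
            (let [acc (f acc line)]
              (if (reduced? acc)
                (deref acc)
                (recur acc)))
            acc))))))
```

For example, `(reduce (fn [n _] (inc n)) 0 (lines-reducible "big.txt"))` counts lines in constant memory, and `into`/`transduce` work on it directly.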
On Sun, Oct 28, 2012 at 9:38 AM, Christian Sperandio wrote:
>
> I've got a question about lazy-sequence and file reading.
> Is line-seq good to process lines from huge file?
> Let take this case, I want to process each line from a file with one or more
> functions. All lines must be processed. Lin
Just started using Clojure, found myself asking a similar question. Was
writing programs operating as part of unix command shell pipes, so I wrote
a macro that did something like the perl diamond operator. It iterates over
a series of files and standard input, opening each in turn and reading lines
I've got a question about lazy sequences and file reading.
Is line-seq good for processing lines from a huge file?
Take this case: I want to process each line from a file with one or
more functions. All lines must be processed. line-seq returns a lazy
sequence, which means all already-read lines stay in memory
Yeah, `read-lines` is what I was referring to.
--
Devin Walters
On Friday, October 26, 2012 at 9:10 PM, Andy Fingerhut wrote:
> Devin, did you mean read-lines from the old clojure.contrib.io?
>
> http://clojuredocs.org/clojure_contrib/clojure.contrib.io/read-lines
Andy,
That's the "custom seq that closes the file..." I was referring to. I
guess I'm looking for a magical line-seq that closes the file correctly
even if you consume part of the sequence, is resilient to exceptions,
etc, etc. I realize that it might be impossible, so I asked. :)
Thanks,
Dave
Devin, did you mean read-lines from the old clojure.contrib.io?
http://clojuredocs.org/clojure_contrib/clojure.contrib.io/read-lines
Click the "+" symbol next to "Source" to see source code, also available here:
https://github.com/richhickey/clojure-contrib/blob/061f3d5b45657a89faa335ffa2bb80819f
I usually wind up with the line-seq from old contrib. Could you be more clear
about what isn't satisfying about that? For me it usually boils down to: it's
unsatisfying that core line-seq doesn't do that by default.
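For context, the contrib idea can be sketched like this (a simplified sketch, not the actual clojure.contrib.io source): a lazy seq that closes the reader once the file is exhausted. It still leaks the reader if the sequence is only partially consumed, which is exactly the limitation being discussed:

```clojure
(require '[clojure.java.io :as io])

(defn read-lines-sketch
  "Lazy seq of lines from file-name; closes the reader when the
  seq is fully realized. A partially consumed seq never closes it."
  [file-name]
  (let [rdr (io/reader file-name)
        step (fn step []
               (lazy-seq
                 (if-let [line (.readLine rdr)]
                   (cons line (step))
                   (.close rdr))))]   ; returns nil, ending the seq
    (step)))
```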
'(Devin Walters)
On Oct 26, 2012, at 6:45 PM, Dave Ray wrote:
> Hi,
>
> At w
Hi,
At work I've had a few conversations about treating files, especially
large ones, as seqs of lines. In particular, the apparent conflict
between using clojure.core/with-open to ensure a file is closed
appropriately, and clojure.core/line-seq as a generic sequence of
lines which may be consumed lazily.