On Friday, February 13, 2015 at 12:47:17 AM UTC-5, Justin Smith wrote:
>
> I don't want to sound too brusque in my defense of Clojure. I'm a huge 
> fan, so criticism of the language does get me a bit defensive.
>

I am a fan as well. Constructive criticism is useful as it can lead to 
improvements.
 

> But preferring different behavior is actually reasonable here. There are 
> other Lisps (and languages in the ML family) that are more internally 
> consistent. I'll continue to prefer the better performance and easy access 
> to host interop, even if it does violate the occasional abstraction. But of 
> course there is no reason to begrudge someone else holding a different 
> opinion, and choosing a language that aligns with those expectations.
>
> But I do hope you give Clojure a chance, it's been very rewarding for me 
> and I expect you'd find the same if you take the time to get to know it and 
> get a feel for what's idiomatic
>

Actually, I've been using it for years, and I've never encountered behavior 
like this until now. It is definitely wrong for a lazy sequence either to 
a) appear to have the wrong length or b) appear at different times to have 
different lengths. As long as the var isn't rebound, "primes" indisputably 
should appear to have a fixed, immutable value, whether parts of that value 
are computed lazily via delay/force or otherwise.
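(For readers joining the thread: the original "primes" definition isn't reproduced here, but the kind of self-referential definition under discussion looks roughly like the following hypothetical sketch, where the lazy tail of the sequence reads from the very var being defined.)

```clojure
;; Hypothetical reconstruction, NOT the thread's exact code: a primes
;; sequence whose lazy tail filters candidates by dividing by earlier
;; elements of the same sequence. Testing a candidate n can force the
;; tail of `primes` from inside the computation of that same tail --
;; exactly the self-forcing corner case discussed below. Depending on
;; the Clojure version, this may appear to terminate the sequence early
;; or blow the stack.
(def primes
  (cons 2
        (lazy-seq
          (filter (fn [n]
                    (not-any? #(zero? (rem n %))
                              (take-while #(<= (* % %) n) primes)))
                  (iterate inc 3)))))
```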

It appears that there is a buggy corner case with self-referencing delays: 
if you write (delay code) and "code" holds a reference to the delay and 
tries to force it, the inner deref sees the delay's value as nil rather 
than, as it probably should, throwing an exception. (Recursively invoking 
the delay's code, leading to an eventual StackOverflowError, seems like the 
only other *sensible* alternative; but if the implementation can detect 
that case and return nil, it can just as easily detect it and throw an 
exception directly. Something straightforward, like an IAE with a message 
of "Delay tried to force itself on evaluation". That would be far 
preferable to exposing visible mutation in a thing that is supposed to be 
semantically immutable, such as (rest x) changing from nil to non-nil; and 
an exception is also far preferable to silently returning a logically 
incorrect result.)
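To make the proposal concrete, here is a minimal sketch of a delay-like construct that detects self-forcing and throws, instead of recursing or returning nil. This is NOT how clojure.lang.Delay is actually implemented; it's a single-threaded illustration (a real version would need the thread-safety the built-in delay provides), and the names safe-delay/safe-delay* are made up for this example.

```clojure
;; Sketch only: a delay that throws when its own body tries to force it.
;; Not thread-safe, unlike the real clojure.core/delay.
(defn safe-delay* [f]
  (let [forcing? (atom false)           ; are we currently inside f?
        val      (atom ::unrealized)]
    (reify clojure.lang.IDeref
      (deref [_]
        (when @forcing?
          ;; The body of the delay re-entered deref: fail loudly
          ;; instead of returning nil or overflowing the stack.
          (throw (IllegalStateException.
                   "Delay tried to force itself on evaluation")))
        (when (= ::unrealized @val)
          (reset! forcing? true)
          (try
            (reset! val (f))
            (finally (reset! forcing? false))))
        @val))))

(defmacro safe-delay [& body]
  `(safe-delay* (fn [] ~@body)))

;; With this construct, a self-referential delay such as
;;   (def foo (safe-delay (str @foo)))
;; would throw IllegalStateException on @foo instead of yielding nil.
```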

Incidentally, I don't see the same behavior using my somewhat old copy of 
Clojure 1.5.1:

=> (def foo (delay (str @foo)))
#'user/foo
=> @foo
StackOverflowError   clojure.core/deref (core.clj:2114)

This is what I'd expect a naive implementation to do. So it looks like in 
1.6 this must have changed so that the inner @foo produces nil, allowing 
lazy sequences to appear to terminate early from "inside themselves", and 
in 1.7 it *may* have changed again. I think clarification is needed on 
what is actually going on here. One thing that is already clear is the 
preference order among the possible behaviors:

returning nil < StackOverflowError < a more specific exception such as 
"Delay tried to force itself".

Which suggests that 1.5.1 > 1.6, in this particular area...

-- 
You received this message because you are subscribed to the Google
Groups "Clojure" group.
To post to this group, send email to clojure@googlegroups.com
Note that posts from new members are moderated - please be patient with your 
first post.
To unsubscribe from this group, send email to
clojure+unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/clojure?hl=en