On Wed, Oct 21, 2009 at 9:28 PM, Dmitry Kakurin <dmitry.kaku...@gmail.com> wrote:
> I have an innocent-looking function that only uses map and reduce, but
> when I run it on a big collection (10K elements) I get a StackOverflow
> exception. Here is the function:
>
> (defn multi-filter [filters coll]
>   (reduce
>     (fn [res e]
>       (map (fn [f r] (if (f e) (conj r e) r))
>            filters
>            res))
>     (repeat (count filters) [])
>     coll))

This isn't actually very lazy, which is why you need a lot of heap. It's conjing matching elements onto the vectors in a seq of vectors. Meanwhile the map is generating a lazy seq of new versions of the vectors, and the reduction is wrapping the initial seq of empty vectors in ten thousand layers of map ... fn ... invoke ... map ... etc., which should be showing up in your stack trace as a long repetitive sequence. Reducing a lazy sequence generator like map over a large sequence does not work well in Clojure.

Is there a reason not to use

(defn multi-filter [filters coll]
  (map filter filters (repeat coll)))

instead? That seems to have the same semantics and gives no problems on my machine, using default heap and stack sizes:

user=> (let [[a b c] (multi-filter [even? odd? even?] (range 10000))]
         [(count a) (count b) (count c)])
[5000 5000 5000]
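
(Not from the original reply, just a sketch for comparison: if you did want to keep the reduce-based shape, forcing each intermediate result, e.g. with doall, should stop the lazy layers from nesting. The name multi-filter-eager below is made up for illustration.)

(defn multi-filter-eager [filters coll]
  (reduce
    (fn [res e]
      ;; doall realizes the seq now, so the reduction carries a fully
      ;; evaluated seq instead of stacking another lazy map on top
      (doall (map (fn [f r] (if (f e) (conj r e) r))
                  filters
                  res)))
    (repeat (count filters) [])
    coll))

user=> (let [[a b c] (multi-filter-eager [even? odd? even?] (range 10000))]
         [(count a) (count b) (count c)])
[5000 5000 5000]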