Hi All,
As a newbie, I try to read up on Clojure whenever I can. One of the
common things I read is that Clojure is really good for operating
on large data sets, but I haven't seen anyone articulate why that is
aside from alluding to lazy evaluation. So I assume lazy evaluation is
the main reason?
Laziness helps when dealing with large data sets, but it's also tricky to
get right. If you mistakenly hold onto the head of a lazy sequence, you
prevent the entire realized sequence from being garbage-collected and
will usually run out of memory.
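A minimal sketch of the difference (the range just has to be large enough to exceed your heap for the first form to fail):

```clojure
;; Holding the head: the var `big` keeps a reference to the first cell,
;; so every element realized while `last` walks the sequence stays
;; reachable -- on a large enough range this throws OutOfMemoryError.
(def big (range 100000000))
(last big)

;; Not holding the head: nothing else references the sequence, so each
;; element can be garbage-collected as soon as `last` moves past it.
(last (range 100000000))
```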
I think Clojure is good for dealing with large data sets for three
reasons:
1. Ignoring the "big" part for now, Clojure is good with data in
general because it's functional, has nice data structures, and a well-
designed sequence library. Data manipulation tends to be functional in
nature, more so than general programming (think map-reduce).
2. The
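To illustrate point 1, here's a sketch of map-reduce-style data manipulation with the sequence library: counting words across a seq of lines (`word-counts` is just a name I made up). Because the input is an ordinary seq, the same function works on a small vector of strings or on a lazy stream of lines from a huge file (e.g. via `line-seq`).

```clojure
(require '[clojure.string :as str])

;; Split each line into words (map), drop the blanks (filter), and
;; tally occurrences (reduce -- `frequencies` is reduce-based).
(defn word-counts [lines]
  (->> lines
       (mapcat #(str/split % #"\s+"))
       (remove str/blank?)
       frequencies))

(word-counts ["the quick brown fox" "jumps over the lazy dog"])
;; => {"the" 2, "quick" 1, ...}
```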