Hi everyone, I need some help with loading large data from a file. For instance, I'm trying to load one line of 50,000,000 double values (comma-separated) from a text file. At 8 bytes per double I would expect this to fit in ~400 MB of memory, at least when using a Java double array internally. However, when loading with read-lines from clojure.contrib.duck-streams and (map #(Double/parseDouble %) (.split line ",")), Clojure requires several GB of RAM. Any suggestions for how to get this down to ~400 MB? And what would the overhead be if I read into a Clojure vector, which I would really prefer over Java arrays?
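Here is roughly what I'm doing now, as a sketch. The file name "data.txt" and the helper name parse-line are just stand-ins; the real file holds the single line of comma-separated doubles, and duck-streams comes from the old clojure.contrib library mentioned above:

```clojure
;; Sketch of my current approach ("data.txt" and parse-line are
;; illustrative names, not my real code).
(require '[clojure.contrib.duck-streams :as ds])

(defn parse-line [line]
  ;; Each element ends up as a boxed java.lang.Double inside a lazy seq,
  ;; which I suspect is where the memory blowup comes from.
  (map #(Double/parseDouble %) (.split line ",")))

(def values
  (parse-line (first (ds/read-lines "data.txt"))))
```

As I understand it, a boxed Double plus the seq/cons cells around it costs far more than the 8 bytes a primitive double in a double[] would, which might explain the gap between ~400 MB and several GB.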
Thanks
Johann