Paul Johnson <[EMAIL PROTECTED]> writes:

> Friedrich wrote:
>> I've written just a few programs in Haskell, one in a comparison for a
>> task I had "nearly daily".
>
> The first thing I notice is that this is clearly a direct translation
> from something like Perl. That's understandable, but I'd suggest
> rewriting it with something like this (untested, uncompiled code)

Quite a good match, however it was Bash and Awk. I implemented the same
in C, OCaml, Ruby, Tcl/Tk, Haskell, Smalltalk, Java, Common Lisp and
IIRC C# ;-)
>
> -- Concatenate all the files into one big string.  File reading is
> -- lazy, so this won't take all the memory.
> getAllFiles :: [String] -> IO String
> getAllFiles paths = do
>     contents <- mapM getFile paths
>     return $ concat contents
>
> Then use "lines" to split the result into individual lines and process
> them using "filter", "map" and "foldr".  Because file reading is lazy,
> each line is only read when it is to be processed, and then gets
> reaped by the garbage collector.  So it all runs in constant memory.

Would you mind elaborating on that a bit?  What is so terrible about
opening one file after the other, reading it line by line and closing
the file afterwards?  Of course it needs memory while doing that, but
after the file is closed the memory could be "freed".  So what exactly
makes this use so much memory?

> (By the way, putting in the top level type declarations helps a lot
> when you make a mistake.)

Well, I have my problems with that.  Probably it comes from using
languages like Ruby, and my special dislike of "typing things" comes
especially from Java and C++ (well, C is not "innocent" in that regard
either).

Regards
Friedrich

--
Q-Software Solutions GmbH; Sitz: Bruchsal; Registergericht: Mannheim
Registriernummer: HRB232138; Geschaeftsfuehrer: Friedrich Dominicus

_______________________________________________
Haskell mailing list
Haskell@haskell.org
http://www.haskell.org/mailman/listinfo/haskell
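
For concreteness, a minimal, self-contained sketch of the pipeline
described in the quoted message.  It assumes the standard lazy Prelude
readFile in place of the undefined getFile from the quoted snippet,
takes the file names from the command line, and uses an arbitrary
example predicate (keep only lines containing the word "ERROR"):

    module Main (main) where

    import System.Environment (getArgs)

    -- Concatenate all the files into one big string.  readFile is
    -- lazy, so each file's contents are only pulled in as they are
    -- consumed (note, though, that mapM readFile opens one handle
    -- per file up front).
    getAllFiles :: [FilePath] -> IO String
    getAllFiles paths = do
        contents <- mapM readFile paths
        return $ concat contents

    main :: IO ()
    main = do
        paths <- getArgs
        text  <- getAllFiles paths
        -- Split into lines and keep only those containing the word
        -- "ERROR" (an arbitrary predicate chosen for the example).
        let interesting = filter (elem "ERROR" . words) (lines text)
        putStr (unlines interesting)

Because filter and putStr consume the result of lines lazily, only
about one line at a time needs to be resident; the rest is reclaimed
by the garbage collector, which is the constant-memory behaviour the
quoted message refers to.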