[Haskell-cafe] Re: Snow Leopard breaks GHC
My problems were resolved by removing MacPorts from the system and adding 32-bit flags to runhaskell---apparently its zlib was interfering, in addition to the runhaskell/runghc problems. Thank you for the advice,

Brian

On Wed, Sep 9, 2009 at 3:49 AM, Christian Maeder christian.mae...@dfki.de wrote:
> If compiling the Template Haskell parts of Pandoc still does not work,
> please make a ticket, as Simon wrote in:
> http://hackage.haskell.org/trac/ghc/ticket/2965#comment:24
>
> Cheers Christian
>
> Christian Maeder wrote:
>> Maybe runhaskell is used for Template Haskell? HTH Christian
>>
>> Brian Sniffen wrote:
>>> No, my ghci is now
>>>
>>>   exec /Library/Frameworks/GHC.framework/Versions/610/usr/bin/ghc-6.10.4 -optc-m32 -opta-m32 -optl-m32 --interactive ${1+"$@"}
>>>
>>> and I still see the same result. Also, I have switched to
>>> --ld-options instead of --ld-option, which appears to have been a
>>> typo---cabal and setup never parsed it.
>>>
>>> -Brian
>>>
>>> On Fri, Sep 4, 2009 at 9:51 AM, Christian Maeder christian.mae...@dfki.de wrote:
>>>> Does adding -optc-m32 -opta-m32 -optl-m32 to /usr/bin/ghci as well
>>>> not help? (as I've posted before)
>>>>
>>>> Cheers Christian
>>>>
>>>> Brian Sniffen wrote:
>>>>> Having edited the Haskell Platform's /usr/bin/ghc in place, most
>>>>> packages install fine. I'm still having trouble with Pandoc, even
>>>>> given the advice:
>>>>>
>>>>>> Once cabal works, options --ld-option=-m32 (and also
>>>>>> --gcc-option=-m32) may be used. These options may also be passed
>>>>>> to ./Setup configure
>>>>>
>>>>> The problem appears to come when linking an incompatible zlib
>>>>> version:
>>>>>
>>>>>   src/Text/Pandoc/ODT.hs:49:26:
>>>>>       Exception when trying to run compile-time code:
>>>>>         user error (Codec.Compression.Zlib: incompatible zlib version)
>>>>>       Code: ($) makeZip "data" </> "odt-styles"
>>>>>       In the first argument of `read', namely
>>>>>         `$(makeZip $ "data" </> "odt-styles")'
>>>>>       In the expression: read ($(makeZip $ "data" </> "odt-styles"))
>>>>>       In the definition of `refArchive':
>>>>>           refArchive = read ($(makeZip $ "data" </> "odt-styles"))
>>>>>
>>>>> The same problem occurs when making any call to Codec.Archive.Zip
>>>>> or Codec.Compression.Zlib.
>>>>>
>>>>> I do have a universal zlib installed by MacPorts, as well as the
>>>>> universal zlib that shipped with Snow Leopard and the universal
>>>>> zlib that came with Cabal. I'm not sure whether this message
>>>>> indicates that TH code is searching a different library path than
>>>>> non-TH code, or what. Advice is most welcome. I'm particularly
>>>>> interested in finding out which zlib versions are being found at
>>>>> the construction of Codec.Compression.Zlib and at runtime (Pandoc
>>>>> compile time).

--
Brian Sniffen
http://evenmere.org/~bts/
b...@evenmere.org

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] Re: Snow Leopard breaks GHC
Having edited the Haskell Platform's /usr/bin/ghc in place, most packages install fine. I'm still having trouble with Pandoc, even given the advice:

> Once cabal works, options --ld-option=-m32 (and also --gcc-option=-m32)
> may be used. These options may also be passed to ./Setup configure

The problem appears to come when linking an incompatible zlib version:

  src/Text/Pandoc/ODT.hs:49:26:
      Exception when trying to run compile-time code:
        user error (Codec.Compression.Zlib: incompatible zlib version)
      Code: ($) makeZip "data" </> "odt-styles"
      In the first argument of `read', namely
        `$(makeZip $ "data" </> "odt-styles")'
      In the expression: read ($(makeZip $ "data" </> "odt-styles"))
      In the definition of `refArchive':
          refArchive = read ($(makeZip $ "data" </> "odt-styles"))

The same problem occurs when making any call to Codec.Archive.Zip or Codec.Compression.Zlib.

I do have a universal zlib installed by MacPorts, as well as the universal zlib that shipped with Snow Leopard and the universal zlib that came with Cabal. I'm not sure whether this message indicates that TH code is searching a different library path than non-TH code, or what. Advice is most welcome. I'm particularly interested in finding out which zlib versions are being found at the construction of Codec.Compression.Zlib and at runtime (Pandoc compile time).

--
Brian Sniffen
http://evenmere.org/~bts/
b...@evenmere.org
[Haskell-cafe] Re: Snow Leopard breaks GHC
No, my ghci is now

  exec /Library/Frameworks/GHC.framework/Versions/610/usr/bin/ghc-6.10.4 -optc-m32 -opta-m32 -optl-m32 --interactive ${1+"$@"}

and I still see the same result. Also, I have switched to --ld-options instead of --ld-option, which appears to have been a typo---cabal and setup never parsed it.

-Brian

On Fri, Sep 4, 2009 at 9:51 AM, Christian Maeder christian.mae...@dfki.de wrote:
> Does adding -optc-m32 -opta-m32 -optl-m32 to /usr/bin/ghci as well not
> help? (as I've posted before)
>
> Cheers Christian
>
> Brian Sniffen wrote:
>> Having edited the Haskell Platform's /usr/bin/ghc in place, most
>> packages install fine. I'm still having trouble with Pandoc, even
>> given the advice:
>>
>>> Once cabal works, options --ld-option=-m32 (and also
>>> --gcc-option=-m32) may be used. These options may also be passed to
>>> ./Setup configure
>>
>> The problem appears to come when linking an incompatible zlib version:
>>
>>   src/Text/Pandoc/ODT.hs:49:26:
>>       Exception when trying to run compile-time code:
>>         user error (Codec.Compression.Zlib: incompatible zlib version)
>>       Code: ($) makeZip "data" </> "odt-styles"
>>       In the first argument of `read', namely
>>         `$(makeZip $ "data" </> "odt-styles")'
>>       In the expression: read ($(makeZip $ "data" </> "odt-styles"))
>>       In the definition of `refArchive':
>>           refArchive = read ($(makeZip $ "data" </> "odt-styles"))
>>
>> The same problem occurs when making any call to Codec.Archive.Zip or
>> Codec.Compression.Zlib.
>>
>> I do have a universal zlib installed by MacPorts, as well as the
>> universal zlib that shipped with Snow Leopard and the universal zlib
>> that came with Cabal. I'm not sure whether this message indicates that
>> TH code is searching a different library path than non-TH code, or
>> what. Advice is most welcome. I'm particularly interested in finding
>> out which zlib versions are being found at the construction of
>> Codec.Compression.Zlib and at runtime (Pandoc compile time).
--
Brian Sniffen
http://evenmere.org/~bts/
b...@evenmere.org
Re: Re[2]: The programming language market (was Re: [Haskell-cafe] Why functional programming matters)
On Jan 27, 2008 3:49 AM, Bulat Ziganshin [EMAIL PROTECTED] wrote:
> a few months ago I had a conversation with a current student, and they
> still learn Lisp (!!!). It seems they will switch to more modern FP
> languages no earlier than when this particular professor, the head of
> the PL department, who did interesting AI research in the 60s, dies,
> or at least retires.

I dunno. Sussman and Abelson are not getting any younger, and neither is Felleisen, but others have taken up that torch. So far, those who waited for Lisp to die out have spent a long time waiting. It has not been a winning bet.

-Brian

--
Brian T. Sniffen
[EMAIL PROTECTED] or [EMAIL PROTECTED]
http://www.evenmere.org/~bts
Re: [Haskell-cafe] Doing some things right
On Dec 28, 2007 6:05 AM, Andrew Coppin [EMAIL PROTECTED] wrote:
> [I actually heard a number of people tell me that learning LISP would
> change my life forever because LISP has something called macros. I
> tried to learn it, and disliked it greatly. It's too messy. And what
> the heck is cdr meant to mean anyway? To me, LISP doesn't even seem
> all that different from normal languages (modulo weird syntax). Now
> Haskell... that's FUN!]

"Contents of the Decrement Register", from the IBM 704 that Lisp first ran on.

Macros are like Template Haskell. One example of where they're useful is programmer definition of new binding forms; that's not possible in Haskell without Template Haskell. Macros were invented in Lisp because its syntax is so easy for machines to manipulate---they don't have a tenth the complexity of Template Haskell, for about the same power.

-Brian

--
Brian T. Sniffen
[EMAIL PROTECTED] or [EMAIL PROTECTED]
http://www.evenmere.org/~bts
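The compile-time flavor of Template Haskell can be sketched in a few lines. The example below is mine, not from the thread: a splice that computes a literal while the program is being compiled, roughly what a trivial Lisp macro would do. Note GHC's stage restriction, one place TH is heavier than `defmacro`: the code inside `$( ... )` may only mention imported names, so real binding-form macros live in a separate module from their use sites.

```haskell
{-# LANGUAGE TemplateHaskell #-}
-- Minimal sketch of compile-time code generation.  Everything inside
-- the splice uses only imported names, as the stage restriction requires.
import Language.Haskell.TH (litE, integerL)

main :: IO ()
main = print $(litE (integerL (6 * 7)))  -- the literal 42 is built at compile time
```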
Re: [Haskell-cafe] Re: Waiting for thread to finish
On Nov 28, 2007 5:07 PM, Maurício [EMAIL PROTECTED] wrote:
> Sorry if I sound rude. I just saw a place for a small joke, and used
> it. Chris's code is pretty elegant for what it is supposed to do.
> However, knowing if a thread has finished is just 1 bit of
> information. There's probably a reason why that would hurt
> performance, but I don't understand it.

Most threads either communicate some result---and then you'll care about setting up a channel for that---or run forever.

Some threads run on different computation engines. There's nothing in the Haskell spec that says I have to run the threads on a shared-memory machine. If the threads are distributed, then the channel to communicate back that one has finished may be very expensive.

-Brian

--
Brian T. Sniffen
[EMAIL PROTECTED] or [EMAIL PROTECTED]
http://www.evenmere.org/~bts
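The "communicate some result" case can be sketched concretely (my example, not from the thread): the MVar that carries the answer back is also, for free, the one bit of "finished" information.

```haskell
import Control.Concurrent

-- A worker thread hands its result back through an MVar; taking the
-- MVar blocks until the worker has finished, so no separate "done"
-- flag is needed.
main :: IO ()
main = do
  result <- newEmptyMVar
  _ <- forkIO $ putMVar result (sum [1 .. 1000000 :: Integer])
  r <- takeMVar result  -- blocks until the thread has produced r
  print r               -- 500000500000
```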
Re: [Haskell-cafe] where to put handy functions?
Posix has pretty well taken the name "select". It probably isn't a good idea to use that name in a commonly imported library like Data.List, since users would then have to hide or qualify it whenever they also import the Posix libraries.

--
Brian T. Sniffen
[EMAIL PROTECTED] or [EMAIL PROTECTED]
http://www.evenmere.org/~bts
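For illustration, here is one plausible shape for such a handy function; the thread doesn't fix a signature, so this one is my assumption, not an actual Data.List export. Whatever its home module, importing that module qualified would sidestep the clash with Posix's `select`.

```haskell
-- Hypothetical 'select' (assumed signature): pair each element of a
-- list with the remaining elements, a common helper for permutations.
select :: [a] -> [(a, [a])]
select []       = []
select (x : xs) = (x, xs) : [ (y, x : ys) | (y, ys) <- select xs ]

main :: IO ()
main = print (select "abc")  -- [('a',"bc"),('b',"ac"),('c',"ab")]
```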
[Haskell-cafe] Serializing Functions and Actions for Distributed Programming
I'm very excited by the ability to pass functions or IO actions between threads of the same program. But I don't see any language or library support for doing so between programs, or between sessions with the same program.

OCaml provides a partial solution:

  http://caml.inria.fr/pub/docs/manual-ocaml/libref/Marshal.html

though all it's really sending is an address and a hash of the binary program. Even SerTH doesn't help with functional types.

I seek the knowledge of the Haskell Cafe: is there a reasonable way of addressing this problem?

--
Brian T. Sniffen
[EMAIL PROTECTED] or [EMAIL PROTECTED]
http://www.evenmere.org/~bts
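One common workaround, sketched below under my own assumptions (not something the libraries mentioned above provide), is to serialize a first-order description of the function rather than the closure itself---defunctionalization---so both programs agree on the meaning of what crosses the wire.

```haskell
-- Defunctionalization sketch: 'Fun' is a hypothetical wire format for
-- a small family of functions; only its description is serialized.
data Fun = Add Int | Mul Int deriving (Read, Show)

-- Each receiving program interprets the description locally.
apply :: Fun -> Int -> Int
apply (Add n) = (+ n)
apply (Mul n) = (* n)

main :: IO ()
main = do
  let wire = show (Add 3)       -- the string "Add 3" crosses the boundary
  print (apply (read wire) 10)  -- 13
```

The cost is that only functions nameable in the `Fun` datatype can travel, which is why it is a workaround rather than general closure serialization.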
Re: [Haskell-cafe] Re: Hashtable woes
On 2/10/06, Ketil Malde [EMAIL PROTECTED] wrote:
> Hmm... perhaps it is worth it, then? The benchmark may specify "hash
> table", but I think it is fair to interpret it as "associative data
> structure" - after all, people are using associative arrays that
> (presumably) don't guarantee a hash table underneath, and it can be
> argued that Data.Map is the canonical way to achieve that in Haskell.

Based on this advice, I wrote a k-nucleotide entry using the rough structure of the OCaml entry, but with the manual IO from Chris and Don's Haskell #2 entry. It runs in under 4 seconds on my machine, more than ten times the speed of the fastest Data.HashTable entry.

--
Brian T. Sniffen
[EMAIL PROTECTED] or [EMAIL PROTECTED]
http://www.evenmere.org/~bts
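The Data.Map approach can be sketched roughly like this; this is my own minimal version of the counting step, not the actual shootout entry, which also does careful manual IO.

```haskell
import Data.List (tails)
import qualified Data.Map as M

-- Count k-nucleotide frequencies with an ordered map instead of a
-- hash table; fromListWith (+) does the associative accumulation.
countKmers :: Int -> String -> M.Map String Int
countKmers k s =
  M.fromListWith (+) [ (kmer, 1) | t <- tails s
                                 , let kmer = take k t
                                 , length kmer == k ]

main :: IO ()
main = print (M.toList (countKmers 2 "GGTATT"))
-- [("AT",1),("GG",1),("GT",1),("TA",1),("TT",1)]
```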
[Haskell-cafe] Re: [Haskell] Re: haskell.org Public Domain
It might be nice to at least include some disclaimers of warranty. I'm not a lawyer, but those US copyright lawyers I've spoken with have expressed doubts about anybody's ability to put things into the public domain. Certainly, if you put it in the public domain, you can't also disclaim a warranty.

I suspect that some license that is a superset of the following would be best:

 * the license on the documentation of GHC
 * the license on the documentation of Hugs
 * the license on the Haskell 98 report

I think that works out to be BSD, with an option for authors to copyleft pages or code as they wish.

On the other hand, I also think MoinMoin has lots to offer over MediaWiki---not least ease of maintenance and improvements such as license markers.

--
Brian T. Sniffen
[EMAIL PROTECTED] or [EMAIL PROTECTED]
http://www.evenmere.org/~bts
Re: [Haskell-cafe] Mixing IO and STM
Here's a version that provides clean output with no delays. It uses a single-entry mailbox (the TMVar "output") to ensure the processing doesn't run too far ahead of the log.

module Test where

import System.Random
import Control.Concurrent
import Control.Concurrent.STM

test :: IO ()
test = do
  tv <- atomically (newTVar 0)
  output <- atomically (newTMVar "Log begins")
  forkIO (writer output)
  forkIO (producer tv output)
  consumer tv output

write :: TMVar String -> String -> STM ()
write output message = putTMVar output message

producer tv o = do
  r <- randomRIO (1,10)
  atomically $ do
    v <- readTVar tv
    writeTVar tv (v+r)
    write o ("insert " ++ show r)
  producer tv o
  return ()

consumer tv o = do
  r <- randomRIO (1,10)
  atomically $ do
    v <- readTVar tv
    if v < r then retry else writeTVar tv (v-r)
    write o ("consume " ++ show r)
  consumer tv o
  return ()

writer :: TMVar String -> IO ()
writer o = do
  msg <- atomically $ takeTMVar o
  putStrLn msg
  writer o