[Haskell-cafe] Why haskell.org down again?
Hi all, is haskell.org down again? Is the hardware not stable? An attack? Can't we avoid haskell.org going offline?

-- Andy

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] Re: Suitable structure to represents lots of similar lists
Eugene Kirpichov wrote:
> I think Dan is talking about sharing the spine of the lists...
> How about representing the lists using something along the lines of:
>
>   data List a = Nil | Leaf a | Cat (List a) (List a)
>
>   data Transformed a = Changed a | Unchanged a
>
>   [...]
>
>   cat :: List a -> Transformed (List a)
>       -> Transformed (List a) -> Transformed (List a)
>   cat xs (Unchanged _)  (Unchanged _)  = Unchanged xs
>   cat xs (Changed ys')  (Unchanged zs) = Changed (Cat ys' zs)
>   cat xs (Unchanged ys) (Changed zs')  = Changed (Cat ys zs')
>   cat xs (Changed ys')  (Changed zs')  = Changed (Cat ys' zs')
>
>   mapShared' :: (a -> Transformed a) -> List a -> Transformed (List a)
>   mapShared' f xs@Nil      = Unchanged xs
>   mapShared' f xs@(Leaf a) = case f a of
>     { Unchanged _ -> Unchanged xs ; Changed a' -> Changed (Leaf a') }
>   mapShared' f xs@(Cat ys zs) = cat xs (mapShared' f ys) (mapShared' f zs)
>
>   [...]
>
> So, looks like we preserve whole 'subtrees' shared if they were not
> 'changed' by map or filter.

Yes, but do you actually gain in terms of space usage? Oh! It appears to me that sometimes you do, namely when the list was heavily shared before applying map and filter. But if it's used ephemerally, you don't gain anything.

Regards,
Heinrich Apfelmus

--
http://apfelmus.nfshost.com
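Not part of the original message, but to make the sketch concrete: the fragment above becomes runnable with a small wrapper (mapShared and fromTransformed are illustrative names, not from the thread). Subtrees whose elements are all Unchanged are returned as-is, so they remain shared with the input list.

```haskell
data List a = Nil | Leaf a | Cat (List a) (List a) deriving Show

data Transformed a = Changed a | Unchanged a

-- Illustrative helper (not from the thread): forget the change flag.
fromTransformed :: Transformed a -> a
fromTransformed (Changed a)   = a
fromTransformed (Unchanged a) = a

cat :: List a -> Transformed (List a)
    -> Transformed (List a) -> Transformed (List a)
cat xs (Unchanged _)  (Unchanged _)  = Unchanged xs
cat _  (Changed ys')  (Unchanged zs) = Changed (Cat ys' zs)
cat _  (Unchanged ys) (Changed zs')  = Changed (Cat ys zs')
cat _  (Changed ys')  (Changed zs')  = Changed (Cat ys' zs')

mapShared' :: (a -> Transformed a) -> List a -> Transformed (List a)
mapShared' _ xs@Nil      = Unchanged xs
mapShared' f xs@(Leaf a) = case f a of
  Unchanged _ -> Unchanged xs
  Changed a'  -> Changed (Leaf a')
mapShared' f xs@(Cat ys zs) = cat xs (mapShared' f ys) (mapShared' f zs)

-- Illustrative wrapper: rebuild only the subtrees that actually changed.
mapShared :: (a -> Transformed a) -> List a -> List a
mapShared f = fromTransformed . mapShared' f

main :: IO ()
main = do
  let xs = Cat (Leaf 1) (Cat (Leaf 2) (Leaf 3)) :: List Int
      -- change only even elements; the (Leaf 1) subtree is reused unchanged
      f n = if even n then Changed (n * 10) else Unchanged n
  print (mapShared f xs)   -- Cat (Leaf 1) (Cat (Leaf 20) (Leaf 3))
```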
Re: [Haskell-cafe] GSoC Project: A Haddock + Pandoc documentation tool
> I just finished writing my GSoC proposal and I want to have some feedback from the community. I'll try to be brief (this is not the proposal).

So where is the proposal? Is there a ticket on the GSoC trac?

Cheers,
Simon
Re: [Haskell-cafe] Re: Asynchronous exception wormholes kill modularity
Simon Marlow wrote:
> but they are needlessly complicated, in my opinion. This offers the same functionality:
>
>   mask :: ((IO a -> IO a) -> IO b) -> IO b
>   mask io = do
>     b <- blocked
>     if b then io id else block $ io unblock

How does forkIO fit into the picture? That's one point where reasonable code may want to unblock all exceptions unconditionally - for example to allow the thread to be killed later.

  timeout t io = block $ do
    result <- newEmptyMVar
    tid <- forkIO $ unblock (io >>= putMVar result)
    threadDelay t `onException` killThread tid
    killThread tid
    tryTakeMVar result

regards,
Bertram
Re: [Haskell-cafe] Re: Suitable structure to represents lots of similar lists
2010/4/9 Heinrich Apfelmus apfel...@quantentunnel.de:
> Eugene Kirpichov wrote:
>> [... the List / Transformed representation quoted in full above ...]
>>
>> So, looks like we preserve whole 'subtrees' shared if they were not
>> 'changed' by map or filter.
>
> Yes, but do you actually gain in terms of space usage? Oh! It appears to me that sometimes you do, namely when the list was heavily shared before applying map and filter. But if it's used ephemerally, you don't gain anything.
>
> Regards,
> Heinrich Apfelmus

Yes, in an ephemeral scenario (i.e. computing sum (mapShared ... (mapShared ... (filterShared ... xs)))) we seemingly don't gain anything at all, but that is precisely the scenario where we don't need sharing :) We do gain something if the program's working set of live objects includes many results of mapping and filtering the same list at once: then their sublists will be shared.

--
Eugene Kirpichov
Senior Developer, JetBrains
[Haskell-cafe] Re: Announce: hothasktags
On 04/08/2010 01:09 AM, Luke Palmer wrote:
> On Wed, Apr 7, 2010 at 1:23 AM, Evan Laforge qdun...@gmail.com wrote:
>> Derive.PitchDeriver  Derive/Derive.hs  98;  file:Cmd/Cmd.hs
>> Derive.PitchDeriver  Derive/Derive.hs  98;  file:Cmd/Play.hs
>> Derive.PitchDeriver  Derive/Derive.hs  98;  file:Cmd/ResponderSync.hs
>> ... [20 more] ...
>>
>> The vim tag documentation says these are static tags, and implies they are meant to apply to symbols only valid within the same file, but this is clearly not the case. Actually, the vim doc implies that only file: is defined, and doesn't talk about scoped tags, so I'm not sure what is going on. Anyway, whenever I go to a tag I have to first step through a message that says "1 of 25" or so. There's one for each reference in the tags file, even though those are references in other files.

Hmm, odd, I don't get that behavior. Is that with the sorted annotation? What version of vim? I get the correct behavior (no additional selection is needed). My vim version is 7.2.385. I did not use the sorted annotation, but I doubt it has anything to do with it.

> What's going on? I even checked the current docs at vim.org and they don't mention a file:xyz form either. I think I saw it documented *somewhere*, but now that I look again I can't find it anywhere. Maybe it was in a dream. I hope a newer version of vim didn't remove the support or something...

Even if it is not documented, it makes sense. When file: is present, it limits the tag to the file which is the argument of file:. If file: does not have an argument, then the argument defaults to the file in which the tag is defined. That would mean that exported symbols should be generated without file:, and non-exported symbols should be generated with an argument-less file:. In addition, qualified symbols should be generated with the qualification and with file:fn, where fn is the name of the file in which the symbol is imported qualified. GHCi's :ctags does not support qualified symbols :-/ Just to clarify it for myself.
That would mean that the tags file should look like this:

  B.x       b.hs  /^x = 5 :: Int$/;v  file:a.hs
  B.x       b.hs  /^x = 5 :: Int$/;v  file:c.hs
  C.x       c.hs  /^x = B.x+1$/;v     file:a.hs
  localAct  a.hs  /^localAct = do$/;v file:
  x         b.hs  /^x = 5 :: Int$/;v
  x         c.hs  /^x = B.x+1$/;v
  ...

for these files:

=== file a.hs ===
module A () where
import qualified B as B
import qualified C as C
localAct = do
  print B.x
  print C.x

=== file b.hs ===
module B (x) where
x = 5 :: Int

=== file c.hs ===
module C (x) where
import qualified B as B
x = B.x+1
Re: [Haskell-cafe] Performance question
A late reply, but if you still need circular module dependencies, see "4.6.9. How to compile mutually recursive modules" in http://www.haskell.org/ghc/docs/latest/html/users_guide/separate-compilation.html

On 21 March 2010 01:31, Arnoldo Muller arnoldomul...@gmail.com wrote:
> Hello Daniel,
> Regarding your solution, can I apply {-# SPECIALISE ... #-} statements to datatypes I define? If so, I am not able to import the datatypes into the module where binarySearch is: if I import them, a circular dependency is detected and the compiler gives an error. Is there a way of importing a datatype from another module that avoids this circular dependency?
> Thank you, Arnoldo

On Thu, Mar 18, 2010 at 10:48 PM, Daniel Fischer daniel.is.fisc...@web.de wrote:
> Am Donnerstag 18 März 2010 21:57:34 schrieb Daniel Fischer:
>> Contrary to my expectations, however, using unboxed arrays is slower than straight arrays (in my tests).
>
> However, a few {-# SPECIALISE #-} pragmas set the record straight. Specialising speeds up both boxed and unboxed arrays significantly, but now, for the specialised types, unboxed arrays are faster (note, however, that when the code for the binary search is in the same module as it is used, with optimisations, GHC will probably specialise it itself. If binarySearch is not exported, AFAIK, you can delete "probably").
> {-# LANGUAGE BangPatterns #-}
> module SATBinSearch (binarySearch) where
>
> import Data.Array.IArray
> import Data.Array.Base (unsafeAt)
> import Data.Bits
>
> {-# SPECIALISE binarySearch :: Double -> Array Int Double -> Int #-}
> {-# SPECIALISE binarySearch :: Int -> Array Int Int -> Int #-}
> {-# SPECIALISE binarySearch :: Bool -> Array Int Bool -> Int #-}
> {-# SPECIALISE binarySearch :: Char -> Array Int Char -> Int #-}
> {-# SPECIALISE binarySearch :: Float -> Array Int Float -> Int #-}
>
> binarySearch :: Ord a => a -> Array Int a -> Int
> binarySearch q a = go l h
>   where
>     (l,h) = bounds a
>     go !lo !hi
>       | hi < lo   = -(lo+1)
>       | otherwise = case compare mv q of
>           LT -> go (m+1) hi
>           EQ -> m
>           GT -> go lo (m-1)
>       where
>         -- m = lo + (hi-lo) `quot` 2
>         m  = (lo .&. hi) + (lo `xor` hi) `shiftR` 1
>         mv = a `unsafeAt` m
>
> Use Data.Array.Unboxed and UArray if possible. Now the bit-fiddling instead of arithmetic makes a serious difference, about 20% for unboxed arrays, 17% for boxed arrays (Double), so I'd recommend that.

-- Ozgur Akgun
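A quick way to exercise the bit-twiddling midpoint from the code above (this demo is mine, not from the thread; it uses safe (!) indexing instead of unsafeAt so it is self-contained):

```haskell
import Data.Array
import Data.Bits ((.&.), xor, shiftR)

-- Standalone copy of the thread's binary search: returns the index of q,
-- or -(insertionPoint+1) when q is absent.
binarySearch :: Ord a => a -> Array Int a -> Int
binarySearch q a = go l h
  where
    (l, h) = bounds a
    go lo hi
      | hi < lo   = -(lo + 1)
      | otherwise = case compare mv q of
          LT -> go (m + 1) hi
          EQ -> m
          GT -> go lo (m - 1)
      where
        -- overflow-safe midpoint via bit operations, as in the thread:
        -- (lo .&. hi) keeps the common bits, half the xor adds the difference
        m  = (lo .&. hi) + ((lo `xor` hi) `shiftR` 1)
        mv = a ! m

main :: IO ()
main = do
  let arr = listArray (0, 4) [2, 3, 5, 7, 11] :: Array Int Int
  print (binarySearch 5 arr)   -- 2 (index of 5)
  print (binarySearch 4 arr)   -- -3 (4 is absent; would insert at index 2)
```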
Re: [Haskell-cafe] Re: Asynchronous exception wormholes kill modularity
On Fri, Apr 9, 2010 at 3:22 AM, Isaac Dupree m...@isaac.cedarswampstudios.org wrote:
> OK, thanks for the link! In fact [tell me if my reasoning is wrong...], in that fork-definition the 'putMVar' will never block, because there is only one putMVar for each created MVar.

Yes, that's correct.

> I seem to remember that any execution of putMVar that does not *actually* block is guaranteed not to be interrupted by asynchronous exceptions (if within a Control.Exception.block) -- which would be sufficient. Is my memory right or wrong?

The following documentation seems to suggest that any function which _may_ itself block is defined as interruptible: http://haskell.org/ghc/docs/latest/html/libraries/base-4.2.0.0/Control-Exception.html#13

That doesn't answer your question precisely, however. If it is the case that operations are only interruptible when they actually block, then I don't need a nonInterruptibleMask in this last example. However, I still need one in my first example, because the takeMVar in decrement may absolutely block.

regards,
Bas
Re: [Haskell-cafe] Re: Asynchronous exception wormholes kill modularity
On 09/04/2010 09:40, Bertram Felgenhauer wrote:
> Simon Marlow wrote:
>> but they are needlessly complicated, in my opinion. This offers the same functionality:
>>
>>   mask :: ((IO a -> IO a) -> IO b) -> IO b
>>   mask io = do
>>     b <- blocked
>>     if b then io id else block $ io unblock
>
> How does forkIO fit into the picture? That's one point where reasonable code may want to unblock all exceptions unconditionally - for example to allow the thread to be killed later.

Sure, and it works exactly as before in that the new thread inherits the masking state of its parent thread. To unmask exceptions in the child thread you need to use the restore operator passed to the argument of mask. This does mean that if you fork a thread inside mask and don't pass it the restore operation, then it has no way to ever unmask exceptions. At worst, this means you have to pass a restore value around where you didn't previously.

>   timeout t io = block $ do
>     result <- newEmptyMVar
>     tid <- forkIO $ unblock (io >>= putMVar result)
>     threadDelay t `onException` killThread tid
>     killThread tid
>     tryTakeMVar result

This would be written

  timeout t io = mask $ \restore -> do
    result <- newEmptyMVar
    tid <- forkIO $ restore (io >>= putMVar result)
    threadDelay t `onException` killThread tid
    killThread tid
    tryTakeMVar result

though the version of timeout in System.Timeout is better for various reasons.

Cheers,
Simon
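For readers on a modern GHC: the combinator discussed in this thread later shipped in base 4.3 as Control.Exception.mask. A minimal sketch (mine, not from the thread) of the inherit-and-restore behaviour Simon describes - the forked child inherits the parent's masked state, and only becomes interruptible because the parent hands it the restore function:

```haskell
import Control.Concurrent
import Control.Exception

main :: IO ()
main = do
  done <- newEmptyMVar
  mask $ \restore -> do
    -- Without the 'restore' wrapper the child would stay masked forever,
    -- since it inherits the parent's masking state at the fork point.
    _ <- forkIO $ restore $ do
      st <- getMaskingState
      putMVar done st
    return ()
  st <- takeMVar done
  print st   -- Unmasked: restore reinstated the pre-mask state
```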
[Haskell-cafe] Re: Asynchronous exception wormholes kill modularity
On 08/04/2010 21:20, Tyson Whitehead wrote:
> On March 26, 2010 15:51:42 Isaac Dupree wrote:
>> On 03/25/10 12:36, Simon Marlow wrote:
>>> I'd also be amenable to having block/unblock count nesting levels instead. I don't think it would be too hard to implement and it wouldn't require any changes at the library level.
>>
>> Wasn't there a reason that it didn't nest? I think it was that operations that block-as-in-takeMVar, for an unbounded length of time, are always supposed to Control.Exception.unblock and in fact be unblocked within that operation. Otherwise the thread might never receive its asynchronous exceptions.
>
> If I'm understanding correctly here, it would be nice if you could just go
>
>   unmask :: IO a -> IO a
>
> and always have asynchronous exceptions on for the duration of IO a. One solution might be if unmask always enabled asynchronous exceptions but, in a masked context, stopped them from propagating beyond the boundary of the IO a action by re-queueing them to be re-raised when next allowed. Of course, you've then got the problem of unmask having to produce a valid value even when IO a was aborted, so you would have to go with something like
>
>   unmask :: a -> IO a -> IO a
>
> where the first a gets returned if the IO a computation gets aborted by an exception. The original problem code would then go from

I haven't seen anyone else asking for this kind of design, and it's quite different to both what we have now and the new proposal. What advantages would this have, do you think?

Cheers,
Simon
[Haskell-cafe] Re: Simple game: a monad for each player
Gwern Branwen wrote:
> Yves Parès limestr...@gmail.com wrote:
>> [...] But when running the game, the program cannot switch from a player's monad to another. Do you have any suggestion?
>
> Your desires remind me of the MonadPrompt package http://hackage.haskell.org/package/MonadPrompt which, IIRC, has been used in some game demos to provide abstraction from IO/test harness/pure AI etc.

The game demo can be found by chasing links from the package documentation: http://int-e.home.tlink.de/haskell/solitaire.tar.gz

There's also my package operational http://hackage.haskell.org/package/operational which implements the same concept. It's thoroughly explained here:

  http://apfelmus.nfshost.com/articles/operational-monad.html
  http://projects.haskell.org/operational/

Regards,
Heinrich Apfelmus

--
http://apfelmus.nfshost.com
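To illustrate the idea behind MonadPrompt/operational (the code below is a hand-rolled sketch with illustrative names, not the actual API of either package): a player strategy is written against an abstract instruction set, and separate interpreters can run it purely (canned moves for an AI or a test) or in IO (a human player).

```haskell
{-# LANGUAGE GADTs #-}

-- A minimal operational-style monad: a program is either a result, or an
-- instruction paired with a continuation awaiting the instruction's answer.
data Program instr a where
  Return :: a -> Program instr a
  Bind   :: instr b -> (b -> Program instr a) -> Program instr a

instance Functor (Program instr) where
  fmap f (Return a) = Return (f a)
  fmap f (Bind i k) = Bind i (fmap f . k)

instance Applicative (Program instr) where
  pure = Return
  Return f <*> p = fmap f p
  Bind i k <*> p = Bind i (\b -> k b <*> p)

instance Monad (Program instr) where
  Return a >>= f = f a
  Bind i k >>= f = Bind i (\b -> k b >>= f)

-- The instructions a player can issue (here just one, for brevity):
data Move b where
  AskMove :: Move Int

askMove :: Program Move Int
askMove = Bind AskMove Return

-- A pure interpreter feeding canned moves, e.g. for an AI or a test harness:
runPure :: [Int] -> Program Move a -> a
runPure _      (Return a)       = a
runPure (m:ms) (Bind AskMove k) = runPure ms (k m)
runPure []     (Bind AskMove _) = error "ran out of canned moves"

-- A strategy written once, independent of how moves are obtained:
strategy :: Program Move Int
strategy = do
  a <- askMove
  b <- askMove
  return (a + b)

main :: IO ()
main = print (runPure [3, 4] strategy)   -- 7
```

An IO interpreter (reading moves from stdin, say) would have the same shape as runPure, which is exactly the switch-between-player-backends flexibility the original question asks for.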
Re: [Haskell-cafe] Re: Asynchronous exception wormholes kill modularity
On Fri, Apr 9, 2010 at 10:40 AM, Bertram Felgenhauer bertram.felgenha...@googlemail.com wrote:
> How does forkIO fit into the picture? That's one point where reasonable code may want to unblock all exceptions unconditionally - for example to allow the thread to be killed later.
>
>   timeout t io = block $ do
>     result <- newEmptyMVar
>     tid <- forkIO $ unblock (io >>= putMVar result)
>     threadDelay t `onException` killThread tid
>     killThread tid
>     tryTakeMVar result

The System.Timeout.timeout function is indeed problematic: http://haskell.org/ghc/docs/latest/html/libraries/base-4.2.0.0/System-Timeout.html

To quote the documentation: "...The design of this combinator was guided by the objective that timeout n f should behave exactly the same as f as long as f doesn't time out..." and "...It is also possible for f to receive exceptions thrown to it by another thread..."

They seem to contradict each other, because when 'f' has asynchronous exceptions blocked, 'timeout n f' should also have asynchronous exceptions blocked, since it should behave the same; however, the latter says that 'f' may always receive asynchronous exceptions. Of course, for the timeout function to work correctly, 'f' must be able to receive asynchronous exceptions, otherwise it won't terminate when the Timeout exception is asynchronously thrown to it:

  timeout :: Int -> IO a -> IO (Maybe a)
  timeout n f
    | n < 0     = fmap Just f
    | n == 0    = return Nothing
    | otherwise = do
        pid <- myThreadId
        ex  <- fmap Timeout newUnique
        handleJust (\e -> if e == ex then Just () else Nothing)
                   (\_ -> return Nothing)
                   (bracket (forkIO (threadDelay n >> throwTo pid ex))
                            (killThread)
                            (\_ -> fmap Just f))

Now when we rewrite 'bracket' using 'mask', so that it's no longer an asynchronous exception wormhole, and we apply timeout to a computation in a thread that has asynchronous exceptions blocked, the computation won't actually time out, because it won't be able to receive the Timeout exception.
I think we just have to live with this and explain clearly in the documentation of timeout that you should not call it from a masked thread.

regards,
Bas
[Haskell-cafe] Re: Asynchronous exception wormholes kill modularity
On Wed, Apr 7, 2010 at 5:12 PM, Simon Marlow marlo...@gmail.com wrote:
> Comments? I have a working implementation, just cleaning it up to make a patch.

Can you also take a look at these bugs I reported earlier:

  http://hackage.haskell.org/trac/ghc/ticket/3944
  http://hackage.haskell.org/trac/ghc/ticket/3945

These can also be solved with 'mask'.

regards,
Bas
[Haskell-cafe] Call for Contributions - Haskell Communities and Activities Report, May 2010 edition
Dear all,

I would like to collect contributions for the 18th edition of the Haskell Communities and Activities Report

  http://www.haskell.org/communities/

Submission deadline: 1 May 2010

(please send your contributions to hcar at haskell.org, in plain text or LaTeX format)

This is the short story:

* If you are working on any project that is in some way related to Haskell, please write a short entry and submit it. Even if the project is very small or unfinished or you think it is not important enough -- please reconsider and submit an entry anyway!

* If you are interested in an existing project related to Haskell that has not previously been mentioned in the HCAR, please tell me, so that I can contact the project leaders and ask them to submit an entry.

* Feel free to pass on this call for contributions to others that might be interested.

More detailed information:

The Haskell Communities and Activities Report is a bi-annual overview of the state of Haskell as well as Haskell-related projects over the last, and possibly the upcoming, six months. If you have only recently been exposed to Haskell, it might be a good idea to browse the November 2009 edition -- you will find interesting topics described as well as several starting points and links that may provide answers to many questions.

Contributions will be collected until the submission deadline. They will then be compiled into a coherent report that is published online as soon as it is ready. As always, this is a great opportunity to update your webpages, make new releases, announce or even start new projects, or to talk about developments you want every Haskeller to know about!

Looking forward to your contributions,

Janis (current editor)

FAQ:

Q: What format should I write in?
A: The required format is a LaTeX source file, adhering to the template that is available at

  http://haskell.org/communities/05-2010/template.tex

There is also a LaTeX style file at

  http://haskell.org/communities/05-2010/hcar.sty

that you can use to preview your entry. If you do not know LaTeX, then use plain text. If you modify an old entry that you have written for an earlier edition of the report, you should already have received your old entry as a template (provided I have your valid email address). Please modify that template, rather than using your own version of the old entry as a template.

Q: Can I include images?

A: Yes, you are even encouraged to do so. Please use .jpg format, then.

Q: How much should I write?

A: Authors are asked to limit entries to about one column of text. This corresponds to approximately one page, or 40 lines of text, with the above style and template. A general introduction is helpful. Apart from that, you should focus on recent or upcoming developments. Pointers to online content can be given for more comprehensive or historic overviews of a project. Images do not count towards the length limit, so you may want to use this opportunity to pep up entries. There is no minimum length for an entry! The report aims to be as complete as possible, so please consider writing an entry, even if it is only a few lines long.

Q: Which topics are relevant?

A: All topics which are related to Haskell in some way are relevant. We have usually had reports from users of Haskell (private, academic, or commercial), from authors of or contributors to projects related to Haskell, and from people working on the Haskell language, libraries, language extensions, or variants. We also like reports about distributions of Haskell software, Haskell infrastructure, and books and tutorials on Haskell. Reports on past and upcoming events related to Haskell are also relevant. Finally, there might be new topics we do not even think about.
As a rule of thumb: if in doubt, then it probably is relevant and has a place in the HCAR. You can also ask the editor.

Q: Is unfinished work relevant? Are ideas for projects relevant?

A: Yes! You can use the HCAR to talk about projects you are currently working on. You can use it to look for other developers who might help you. You can use it to write wishlist items for libraries and language features you would like to see implemented.

Q: If I do not update my entry, but want to keep it in the report, what should I do?

A: Tell the editor that there are no changes. The old entry will be reused in this case, but it might be dropped if it is older than a year, to give more room and more attention to projects that change a lot. Do not resend complete entries if you have not changed them.
Re: [Haskell-cafe] Re: GSoC: Hackage 2.0
On Thu, Apr 8, 2010 at 11:07 PM, Matthew Gruen wikigraceno...@gmail.com wrote:
> On Thu, Apr 8, 2010 at 10:58 PM, Antoine Latter aslat...@gmail.com wrote:
>> One thing in the branch over in http://code.haskell.org/hackage-server is the ability for package maintainers to upload documentation to the server. This way we're not tying the ability of the server to do a build-check to the ability to host documentation for a package.
>> Antoine
>
> Antoine, hi, I'm still absorbing the great things that have been done in hackage-server so far. I saw some documentation replacement logic in the code, but don't see a web interface for it. Is there one I'm missing?

There's nothing so useful! The feature is definitely missing proper tool support - at the moment there is only a POST request to a particular address.

Antoine
Re: [Haskell-cafe] Re: True Random Numbers
mo...@deepbondi.net writes:
> are you using the hackage-released version of random-fu or the darcs one?

I was using the hackage version, but since then I switched to the darcs version. (Btw, I began using it in some of my projects and I'm really happy with it.) In the above case, I was using IO in the first bgroup and State StdGen in the second.

I'm running it on an x86_64 Gentoo Linux box with GHC 6.10.4 and was unable to install Criterion (apparently, "the impossible happened" while compiling vector-algorithms), so I used 'time' to come up with some results. The below doesn't include IO tests (randomRIO, etc.), since they turned out to be spectacularly slow anyway. Results using ghc -O2.

  module Main (main) where

  import Data.Random
  import Data.List
  import Control.Monad.State
  import Control.Monad.Random
  import System.Random

  test = p1 `fmap` getStdGen

  type RType = Double

  count = 10 ^ 6
  range = (-10, 10)

  type P = StdGen -> [RType]

  p1 = evalState (sample (replicateM count (uncurry uniform range))) :: P
  p2 = evalRand (replicateM count (getRandomR range)) :: P
  p3 = take count . evalRand (getRandomRs range) :: P

  main = test >>= (print . foldl' (+) 0)

/usr/bin/time results for (test, RType):

  (p1, Double): ~3.3 secs
  (p2, Double): ~1.7 secs
  (p3, Double): ~1.0 sec
  (p1, Int):    ~1.9 secs
  (p2, Int):    ~1.0 sec
  (p3, Int):    ~0.5 sec

Using 'sum' turned out to be rather misleading (it took up to a minute to sum up 'Double's; this problem was less apparent for p1), so I had to use foldl' here to get consistent results between 'Int's and 'Double's. '`using` rnf' produced similar results. Also, using DevURandom for random-fu produces almost the same results.

-- Gökhan San
Re: [Haskell-cafe] Re: True Random Numbers
Thanks for the clues, I'll try and make some time this weekend to track it down. I do have some Gentoo x64 systems to play with. My first impulse is that it is likely due to differences in inlining and/or rewrite-rule processing between the GHC versions, but we'll see what turns up.

-- James

On Apr 9, 2010, at 6:51 AM, Gökhan San wrote:
> [full benchmark write-up quoted verbatim above]
Re: [Haskell-cafe] Re: Asynchronous exception wormholes kill modularity
Simon Marlow wrote:
> On 09/04/2010 09:40, Bertram Felgenhauer wrote:
>> Simon Marlow wrote:
>>>   mask :: ((IO a -> IO a) -> IO b) -> IO b
>>
>> How does forkIO fit into the picture? That's one point where reasonable code may want to unblock all exceptions unconditionally - for example to allow the thread to be killed later.
>
> Sure, and it works exactly as before in that the new thread inherits the masking state of its parent thread. To unmask exceptions in the child thread you need to use the restore operator passed to the argument of mask. This does mean that if you fork a thread inside mask and don't pass it the restore operation, then it has no way to ever unmask exceptions. At worst, this means you have to pass a restore value around where you didn't previously.
>
>>   timeout t io = block $ do
>>     result <- newEmptyMVar
>>     tid <- forkIO $ unblock (io >>= putMVar result)
>>     threadDelay t `onException` killThread tid
>>     killThread tid
>>     tryTakeMVar result
>
> This would be written
>
>   timeout t io = mask $ \restore -> do
>     result <- newEmptyMVar
>     tid <- forkIO $ restore (io >>= putMVar result)
>     threadDelay t `onException` killThread tid
>     killThread tid
>     tryTakeMVar result

I'm worried about the case when this function is called with exceptions already blocked. Then 'restore' will be the identity, and exceptions will continue to be blocked inside the forked thread. You could argue that this is the responsibility of the whole chain of callers (who'd have to supply their own 'restore' functions that would have to be incorporated into the 'io' action), but that goes against modularity. In my opinion there's a valid demand for an escape hatch out of the blocked-exception state for newly forked threads. It could be baked into a variant of the forkIO primitive, say

  forkIOwithUnblock :: ((IO a -> IO a) -> IO b) -> IO ThreadId

Kind regards,
Bertram
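As it happens, the escape hatch Bertram asks for later landed in base 4.4 (GHC 7.2) as Control.Concurrent.forkIOWithUnmask, with a rank-2 type rather than the signature sketched above. A minimal sketch (mine, not from the thread) showing the child can unmask even when the parent forked it under a mask:

```haskell
import Control.Concurrent
import Control.Exception

main :: IO ()
main = do
  done <- newEmptyMVar
  mask_ $ do
    -- The child inherits the masked state, but receives its own unmask
    -- function that works regardless of the parent's masking state.
    _ <- forkIOWithUnmask $ \unmask -> unmask $ do
      st <- getMaskingState
      putMVar done st
    return ()
  takeMVar done >>= print   -- Unmasked
```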
Re: [Haskell-cafe] Re: Asynchronous exception wormholes kill modularity
On 08/04/2010 06:27, Dean Herington wrote:
> Is there any reason not to use the more standard "uninterruptible" instead of "noninterruptible"?

Good point, I'll change that.

Cheers,
Simon
Re: [Haskell-cafe] Re: Asynchronous exception wormholes kill modularity
On 09/04/2010 10:33, Bas van Dijk wrote:
> On Fri, Apr 9, 2010 at 3:22 AM, Isaac Dupree m...@isaac.cedarswampstudios.org wrote:
>> I seem to remember that any execution of putMVar that does not *actually* block is guaranteed not to be interrupted by asynchronous exceptions (if within a Control.Exception.block) -- which would be sufficient. Is my memory right or wrong?
>
> The following documentation seems to suggest that any function which _may_ itself block is defined as interruptible: http://haskell.org/ghc/docs/latest/html/libraries/base-4.2.0.0/Control-Exception.html#13
>
> That doesn't answer your question precisely, however.

The semantics in our original paper [1] does indeed behave as Isaac described: only if an operation really blocks is it interruptible. However, we've already changed this for throwTo, and potentially we might want to change it for other operations too. It's tricky to keep that behaviour in a multithreaded runtime, because it introduces extra complexity to distinguish between waiting for a response to a message from another CPU and actually blocking waiting for the other CPU to do something.

A concrete example of this is throwTo, which technically should only block if the target thread is inside 'mask', but in practice blocks if the target thread is running on another CPU and the current CPU has just sent a message to the other CPU to request a throwTo. (This is only in GHC 6.14, where we've changed the throwTo protocol to be message-based rather than the previous complicated arrangement of locks.)

Still, I think I'd advocate using STM and/or atomicModifyIORef in cases like this, where it is much easier to write code that is guaranteed not to block and hence not be interruptible.
Cheers, Simon [1] http://www.haskell.org/~simonmar/papers/async.pdf ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
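Simon's closing suggestion can be sketched concretely. The following is a minimal illustration (not from the thread) of why atomicModifyIORef is attractive inside a masked region: unlike a putMVar that may block, it never blocks, so it can never become an interruptible point at which a deferred asynchronous exception is delivered. It uses mask_ from the newer Control.Exception interface discussed in this thread (older base spells it block).

```haskell
import Control.Exception (mask_)
import Data.IORef

main :: IO ()
main = do
  ref <- newIORef (0 :: Int)
  -- Inside mask_, asynchronous exceptions are deferred.  Because
  -- atomicModifyIORef never blocks, there is no interruptible point
  -- here at which a deferred exception could sneak in.
  mask_ $ atomicModifyIORef ref (\n -> (n + 1, ()))
  readIORef ref >>= print
```

The same guarantee is much harder to establish for MVar-based code, where one must argue that every putMVar on the masked path cannot actually block.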
Re: [Haskell-cafe] GSoC Project: A Haddock + Pandoc documentation tool
2010/4/9 Johan Tibell johan.tib...@gmail.com: On Fri, Apr 9, 2010 at 4:35 AM, Mark Lentczner ma...@glyphic.com wrote: On Apr 8, 2010, at 6:55 PM, ViaToR (Alvaro V.) wrote: I just finished writing my GSoC proposal ... The project is about creating a new documentation tool for Haskell projects,... I've taken a brief look and this looks lovely. I'm currently deep at work on re-coding the Haddock backend to produce semantic XHTML rather than table-nested HTML. I'm pretty familiar now with the internals of the backends of Haddock and would be happy to help you. General GSoC question: Would this be a good time to offer to be a mentor? Do we have too many mentors or would it be useful for me to help out here? If so, do I need to register on the GSoC site today or tomorrow? Yes. Since you know Haddock well and care about the project you should definitely sign up. The student application deadline is today at 19:00 UTC. I think mentors can still join after that deadline (if I read the SoC timeline correctly) but I suggest signing up as a mentor today. During the next few weeks the mentors will decide which projects will get accepted and who will mentor them. I second this. We don't have too many mentors, and we need someone like you with good knowledge of HTML, CSS and the new Haddock backend to mentor a project like this. David ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] Re: GSoC: Hackage 2.0
On Wed, 2010-04-07 at 00:40 -0400, Matthew Gruen wrote: Hi Haskellers, I'm Matt Gruen (Gracenotes in #haskell), and the Hackage 2.0 SoC project at http://hackage.haskell.org/trac/summer-of-code/ticket/1587 really piqued my interest. It seems doable, in a summer, to make the new hackage-server more-than-deployment-ready as well as clearing out some items in the hackage bug tracker[0]; so, I've been working on a proposal. In this email I'd like to consolidate my mental notes for haskell-cafe and formulate a roadmap towards a more social Hackage. Great. The most vital part is getting hackage-server http://code.haskell.org/hackage-server/ to a state where it can be switched in place of hackage-scripts http://darcs.haskell.org/hackage-scripts/, and doing it properly, organizing the code so it can be extended painlessly in the future. Yes. I should warn you that I've become increasingly keen on the latter aspect recently. :-) For putting the 2.0 in Hackage 2.0, any interface changes should help the library users and the library writers/uploaders without hurting either of them. Yes, there can sometimes be a bit of a trade-off between users and uploaders. With some proposed features we have to be careful not to annoy one group or the other. Hackage should contain more of the right kind of information. Statistics help everyone, and they're a pretty good gauge on the direction of Hackage as a whole. Package popularity contests are one form of this. Reverse dependencies and even dependency graphs[1] are great, if I can integrate and expand Roel van Dijk's work[2]. Yep, reverse deps are totally doable and really useful. Number of reverse deps, combined with number of downloads, is probably a pretty good popularity metric. There should also be some space on package pages, or on pages a link away from them, for users to contribute information and suggestions. Coders can explain why the package did or did not meet their needs, as a sort of informal bug/enhancement tracking service. 
Yeah, that's where we've got to be careful. Many packages already have bug trackers, and maintainers do not necessarily want yet another website to monitor to see where users are complaining. I think a user commenting system is probably one of the most tricky bits to design, because of the social aspects. There are issues like not duplicating existing mailing lists / bug trackers / wikis and trying to keep information relevant as new releases come out (e.g. imagine a comment saying the package is no good because it lacks feature X, when the current release has feature X). My suggestion is to put this feature further down the TODO list. Another helpful flavor of information is package relationships beyond dependencies: 'Deprecated in favor of Foo', 'a fork of Foo'. Yes, deprecation is important. We currently have some support for that, but it's not very good or easy for maintainers to use. There's also a need for a more interactive form of package documentation, but this should strengthen relationships with existing tools like Haddock and Cabal, not bypass the tools. For example, adding a changelog[3] or making Haddock's declaration-by-declaration commentary more wiki-like[4]. Changelogs seem to be within the scope of Hackage 2.0, integrating with Cabal; Haddock wikification might not be, perhaps deserving a separate student-summer session of its own. These can improve the package page and documentation subtrees. Yes, I'd suggest looking at the changelog issue but probably not wiki haddock editing. That would indeed be cool but is a rather bigger scope. More generally, how can library users find the package they want? Search! Metrics! Categories themselves are great, but a tag system could identify and group specific package functionality. There could be sorting by ratings and reviews (4/5 lambdas!). Metadata searches, like those Sascha Böhme implemented in SoC 2007[5], could be integrated. 
It's not always obvious which ideas will help and which won't see good returns, which makes it all the more important to bring hackage-server to a state where future extensions can be easily written, submitted and deployed. That's the goal here. Again, I suspect this is a feature too far for a GSoC. If we can build the infrastructure which makes adding such features easier then the project would be a success. On the technical side, I realize I'd need to spend a not-insignificant amount of time on a user account system, dealing with authentication and related issues. One additional bit of functionality to manage is the hackage build system, which is used to ensure that packages build and to generate documentation. When building depends on FFI or OS-specific bindings, specific versions of other packages, compiler choice or compiler version choice, including language extensions, this is not trivial. One of two good routes is running cabal server-side to generate build reports and
Re: [Haskell-cafe] GSoC Project: A Haddock + Pandoc documentation tool
2010/4/9 ViaToR (Alvaro V.) alv...@gmail.com: Hello, I just finished writing my GSoC proposal and I want to have some feedback from the community. I'll try to be brief (this is not the proposal). The project is about creating a new documentation tool for Haskell projects, like Sphinx[1] for Python or Scribble[2] for Scheme. We have Haddock, which is a great tool. It can take the sources of a project and generate a useful API reference. But the reference is just a fragment of the whole project documentation. So, Haddock can only do a part of what Sphinx can do. But we have another tool, Pandoc, that takes files in one markup language and transforms them into another format. And this is the more general-purpose part of Sphinx that is missing in Haddock. So we have the tools for creating documentation as useful as other systems; we just need the glue and several improvements. To achieve this project, first I'll have to use Haddock as an API. Currently, the Haddock API is rudimentary and highly experimental, so I would have to extend and test it. Then I would have to write a Haddock backend which would generate the reference in an internal, independent Pandoc format [3]. Finally, I would have to write a new command-line program that would manage the projects, or I would have to add Pandoc support to the Haddock command-line program or Haddock support to Pandoc. IMHO, a new command-line program would be better, each tool with its own purpose. I created an example of what a library documented with this system would look like [4]. The file contains a configuration file a la cabal, a reST file (it could be Markdown) and the HTML output. I tried to show that the HTML output for the reference has to be almost the same as the one Haddock generates (CSS may differ). Note that all the entries of the Haddock reference are first-class entities in the documentation, so you can refer to the entries, render one entry, all of them, or groups of them (there are examples in the reST file). 
I am looking forward to your impressions and suggestions. My main concerns are: 1) Two places containing the API reference. In your example documentation the API reference is included in one of the chapters. Wouldn't it be better to just have it in one place - the Haddock docs? 2) Integration with Haddock docs. I think it would be best if the pages generated by this system and the pages generated by Haddock were integrated as much as possible - both style-wise (sharing CSS, structure, headers, footers, sidebars etc) and also in terms of hyperlinking. Identifiers in your documentation should go to the Haddock docs. It should feel as if the Haddock pages are just a part of the whole documentation structure. 3) Configuration I haven't looked at this yet but I suspect people will not want another configuration file in their projects. Perhaps you could propose some kind of Cabal integration instead. David ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Re: GSoC: Hackage 2.0
I vote for adding a feature that would let people post comments/code snippets to the documentation of other people's packages :) It would be even nicer if you could post comments to individual haskell definitions on the haddock page, and then hide most of them by default under an expander of some sort. I've often spent time trying to figure out how poorly documented functions in someone else's package worked. Once I've figured it out, I usually have a nice little example or explanation that I could post to save other people the same trouble. Having an easy way to do this would be nice. Basically any collaborative/wikish enhancements to the documentation on hackage packages would make me happy :) - Job On Wed, Apr 7, 2010 at 4:43 AM, Matthew Gruen wikigraceno...@gmail.comwrote: On Wed, Apr 7, 2010 at 12:40 AM, Matthew Gruen wikigraceno...@gmail.com wrote: Hi Haskellers, snip Oh, heh, I apologize if that was more of a wall of text than I had realized. The above wasn't a project proposal itself, more the result of some brainstorming and some research. If you have the time to read it, I'd really appreciate your feedback. In fewer words, what kinds of features would benefit the community most for Hackage? /Matt ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] What is the consensus about -fwarn-unused-do-bind ?
As of 6.12.1, the new -fwarn-unused-do-bind warning is activated with -Wall. This is based off a bug report by Neil Mitchell: http://hackage.haskell.org/trac/ghc/ticket/3263 . However, does it make sense for this to be turned on with -Wall? For starters, why should this warning apply only to do blocks and not to explicit usage of >>, etc.? That is, the following code (as specified in the above bug report) generates a warning: do { doesFileExist "foo"; return 1 } yet this doesn't: doesFileExist "foo" >> return 1 If monadic code is going to generate a warning, then shouldn't _all_ monadic code do so, and not just that in do blocks? Secondly, a fair number of packages seem to be disabling this globally; the packages I know of (from mentions on mailing lists and grepping on /srv/code/*/*.cabal on code.haskell.org) that have the -fno-warn-unused-do-bind option being passed to GHC in their .cabal file include: * HsOpenCL * leksah-server * xmonad (including xmonad-contrib) * xmobar * pandoc My reason for bringing this up is that I'm soon about to release a new version of my graphviz library, and am debating what to do. Note that most of these warnings are being caused by usage of a monadic style of parsing (before anyone tells me I should be using an Applicative style instead, polyparse doesn't support Applicative, so I can't) and as such the return value is being evaluated anyway. The way I see it, I have 4 options: 1. Do as the warning suggests and preface usage of these parser combinators with _ <-. 2. Use some function of type (Monad m) => m a -> m () instead of doing _ <-. 3. Duplicate the parser combinators in question so that I have one version that returns a value and another that does the main parse and then returns (); then use this second combinator in do blocks where I don't care about the returned value. 4. Put -fno-warn-unused-do-bind in the .cabal file. The first two options don't appeal to me as being excessive usage of boilerplate; the third involves too much code duplication. 
However, I am loath to just go and disable a warning globally. What does the Haskell community think? Is -fwarn-unused-do-bind a worthwhile warning (and code should be updated so as not to cause it to find anything to warn about)? Or is it more of a hindrance to be disabled? -- Ivan Lazar Miljenovic ivan.miljeno...@gmail.com IvanMiljenovic.wordpress.com ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
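For reference, the first two options can be sketched in a few lines. The ignore helper here is hypothetical (it is the function later standardised in base as Control.Monad.void); the IO action standing in for a parser combinator is likewise just illustrative:

```haskell
-- Option 2's helper: discard a monadic result explicitly.
-- (Later standardised as Control.Monad.void.)
ignore :: Monad m => m a -> m ()
ignore m = m >> return ()

-- A stand-in monadic action whose result we do not care about.
action :: IO Int
action = return 1

main :: IO ()
main = do
  _ <- action    -- option 1: explicit wildcard bind silences the warning
  ignore action  -- option 2: wrap the action, making the intent explicit
  putStrLn "done"
```

Both forms compile cleanly under -Wall with -fwarn-unused-do-bind enabled; the difference is purely which notation better documents the intent to discard the result.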
Re: [Haskell-cafe] What is the consensus about -fwarn-unused-do-bind ?
Ivan Lazar Miljenovic wrote: As of 6.12.1, the new -fwarn-unused-do-bind warning is activated with -Wall. This is based off a bug report by Neil Mitchell: http://hackage.haskell.org/trac/ghc/ticket/3263 . However, does it make sense for this to be turned on with -Wall? For starters, why should this warning apply only to do blocks and not to explicit usage of >>, etc.? That is, the following code (as specified in the above bug report) generates a warning: do { doesFileExist "foo"; return 1 } yet this doesn't: doesFileExist "foo" >> return 1 The comments in that bug report actually mention: My patch does not warn on uses of >>, only in do-notation, where the situation is more clear cut. I take >> to be an explicit sign that the user wants to ignore the result of the first action, whereas in do-notation it may be an accident. So I think it was the right decision. When I first compiled my CHP library with it, I was surprised to find that I had very few instances: 6 warnings in ~3000 lines of heavily-monadic code. And for one or two of those I probably shouldn't be ignoring the return. The way I see it, I have 4 options: 1. Do as the warning suggests and preface usage of these parser combinators with _ <-. 2. Use some function of type (Monad m) => m a -> m () instead of doing _ <-. 3. Duplicate the parser combinators in question so that I have one version that returns a value and another that does the main parse and then returns (); then use this second combinator in do blocks where I don't care about the returned value. 4. Put -fno-warn-unused-do-bind in the .cabal file. The first two options don't appeal to me as being excessive usage of boilerplate; the third involves too much code duplication. However, I am loath to just go and disable a warning globally. I'd be tempted by number two, but it's more typing to write ignore $ than _ <-, so maybe 1 is the best option after all. 
I've frequently encountered the annoyance of monadic return values -- but to satisfy type signatures rather than avoid this warning. For example, I have a CHP parallel operator: (<||>) :: CHP a -> CHP b -> CHP (a,b) and a function writeChannel :: Chanout a -> a -> CHP (). But if you try to write a function like: writeBoth :: a -> (Chanout a, Chanout a) -> CHP () writeBoth x (outA, outB) = writeChannel outA x <||> writeChannel outB x You get a type error (expected: CHP (), but got: CHP ((), ()) -- both return types contain no information anyway!). You either have to append >> return () (or similarly use do-notation), or do as I did and make another form of the operator that discards the output (and there I can't use the add-another-underscore convention!). It's annoying that you end up with lots of operations in libraries duplicated with an underscore variant, or different operators. Sometimes it can make an important semantic difference (e.g. mapM requires Traversable but mapM_ only requires Foldable), but often it's just a matter of no, I don't care about the return value from this (e.g. forkIO). I sometimes wonder if there could be some syntactic sugar that might help, but it does feel like overkill just for this purpose. Thanks, Neil. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
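Neil's writeBoth example can be sketched in a self-contained form. IO here is a stand-in for the CHP monad and the (<||>) below runs its sides sequentially rather than in parallel (the real CHP operator is concurrent); channels are modelled as plain IORefs. The point is only the type-level annoyance and one way around it:

```haskell
import Data.IORef

-- Stand-ins for CHP's types (illustrative only, not the real library).
type Chanout a = IORef a

-- Pair up the results of two actions, like CHP's parallel operator
-- (here merely sequential).
(<||>) :: IO a -> IO b -> IO (a, b)
p <||> q = do { x <- p; y <- q; return (x, y) }

writeChannel :: Chanout a -> a -> IO ()
writeChannel = writeIORef

-- Discarding the uninformative ((), ()) with 'fmap (const ())' satisfies
-- the () result type without defining a second, underscore variant of
-- the operator.
writeBoth :: a -> (Chanout a, Chanout a) -> IO ()
writeBoth x (outA, outB) =
  fmap (const ()) (writeChannel outA x <||> writeChannel outB x)

main :: IO ()
main = do
  a <- newIORef (0 :: Int)
  b <- newIORef 0
  writeBoth 7 (a, b)
  mapM_ (\r -> readIORef r >>= print) [a, b]
```

The fmap (const ()) trick (spelled () <$ in later idiom, or void once it reached base) is the per-call-site alternative to duplicating every operator with a discarding variant.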
Re: [Haskell-cafe] Re: GSoC: Hackage 2.0
On Fri, Apr 9, 2010 at 9:46 AM, Ivan Lazar Miljenovic ivan.miljeno...@gmail.com wrote: Job Vranish job.vran...@gmail.com writes: I vote for adding a feature that would let people post comments/code snippets to the documentation of other people's packages :) You mean turn every hackage project page into a mini wiki? Yep. It would be even nicer if you could post comments to individual haskell definitions on the haddock page, and then hide most of them by default under an expander of some sort. Rather than, you know, providing the maintainer with a patch with some improved documentation? This is often more difficult than it sounds. The biggest obstacle to this approach is that a new hackage version of the package must be uploaded to update the documentation, and the authors (me included) tend to prefer to push new packages only when there are significant changes. Steps involved currently: 0. pull down package source to build manually 1. add documentation/code snippet to source 2. build haddock documentation 3. debug bad/ugly syntax / missing line breaks that break haddock 4. generate a patch 5. email patch to author 6. wait a week for the author to actually get around to applying the patch to whatever repository the source resides in 7. wait several weeks for the author to release the next version of the package Steps involved with mini wiki: 0. add [code] [/code] tags (or whatever) 1. copy 2. paste 3. submit I think making this process easier would greatly increase the community involvement in the generation of documentation and improve the quality of the documentation as a whole. I would imagine that this would not be a trivial task, but I think even something super simple (like what they have for the PHP documentation) would be much better than nothing. - Job ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] GSoC Project: A Haddock + Pandoc documentation tool
1) Two places containing the API reference. In your example documentation the API reference is included in one of the chapters. Wouldn't it be better to just have it in one place - the Haddock docs? The example is just a test of how you could create a fully customizable single document which includes the Haddock reference. Of course, the idea is that you could create documentation without any Haddock render directive (but keeping the cross references) and then enable attaching the Haddock reference in the same document or in another document. 2) Integration with Haddock docs. I think it would be best if the pages generated by this system and the pages generated by Haddock would be integrated as much as possible - both style wise (sharing CSS, structure, headers, footers, sidebars etc) and also in terms of hyper linking. Identifiers in your documentation should go to the Haddock docs. It should feel as if the Haddock pages are just a part of the whole documentation structure. I agree, and I tried to represent that in the example (but I created a new CSS). All the documentation will be rendered by Pandoc so there is no difference between the format of each side. I also think it would be interesting to be able to refer to parts of the documentation from Haddock, but keeping the Haddock format (this is not shown in the example). 3) Configuration I haven't looked at this yet but I suspect people will not want another configuration file in their projects. Perhaps you could propose some kind of Cabal integration instead. That would be awesome. I have to see how Cabal handles these files. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Re: GSoC: Hackage 2.0
On Fri, Apr 9, 2010 at 10:21 AM, Job Vranish job.vran...@gmail.com wrote: On Fri, Apr 9, 2010 at 9:46 AM, Ivan Lazar Miljenovic ivan.miljeno...@gmail.com wrote: Job Vranish job.vran...@gmail.com writes: I vote for adding a feature that would let people post comments/code snippets to the documentation of other people's packages :) You mean turn every hackage project page into a mini wiki? Yep. How would such annotations/snippets/changes react to the next release of the package? Would they be per-package? per version? -Edward Kmett ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What is the consensus about -fwarn-unused-do-bind ?
(before anyone tells me I should be using an Applicative style instead, polyparse doesn't support Applicative, so I can't) Well, polyparse may not support the Applicative class defined in Control.Applicative, but it _does_ have an applicative interface using other names for the same operators (namely, pure == return, (<*>) == apply, (<*) == discard, (<|>) == onFail). 2. Use some function of type (Monad m) => m a -> m () instead of doing _ <-. This function was discussed on the libraries list in the last year or so. I think the consensus name for it was void. Of your 4 alternatives, I quite like this one. Regards, Malcolm ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
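The correspondence Malcolm lists can be illustrated with a toy parser. The type P below is a stand-in, not polyparse itself; only the operator-to-name mapping is the point, and the implementations are the obvious ones under that assumption:

```haskell
newtype P a = P { runP :: String -> Maybe (a, String) }

-- Consume one character, failing on empty input.
item :: P Char
item = P $ \s -> case s of
  (c:cs) -> Just (c, cs)
  []     -> Nothing

-- apply plays the role of (<*>): sequence two parsers, applying the
-- function produced by the first to the result of the second.
apply :: P (a -> b) -> P a -> P b
apply pf px = P $ \s -> do
  (f, s')  <- runP pf s
  (x, s'') <- runP px s'
  return (f x, s'')

-- discard plays the role of (<*): keep the left result, drop the right.
discard :: P a -> P b -> P a
discard px py = P $ \s -> do
  (x, s')  <- runP px s
  (_, s'') <- runP py s'
  return (x, s'')

-- onFail plays the role of (<|>): try the right parser if the left fails.
onFail :: P a -> P a -> P a
onFail px py = P $ \s -> maybe (runP py s) Just (runP px s)

main :: IO ()
main = print (runP (item `discard` item) "ab")
```

So a parser written against polyparse can be applicative in style even without the Applicative instance, at the cost of using these names instead of the Control.Applicative operators.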
Re: [Haskell-cafe] Re: GSoC: Hackage 2.0
It would be even nicer if you could post comments to individual haskell definitions on the haddock page, and then hide most of them by default under an expander of some sort. Rather than, you know, providing the maintainer with a patch with some improved documentation? How much cooler it would be, if the wiki-like comment on Hackage could automatically be converted into a darcs/git/whatever patch, and mailed to the package author/maintainer by Hackage itself. Regards, Malcolm ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Re: GSoC: Hackage 2.0
On Fri, Apr 9, 2010 at 10:31 AM, Edward Kmett ekm...@gmail.com wrote: On Fri, Apr 9, 2010 at 10:21 AM, Job Vranish job.vran...@gmail.comwrote: On Fri, Apr 9, 2010 at 9:46 AM, Ivan Lazar Miljenovic ivan.miljeno...@gmail.com wrote: Job Vranish job.vran...@gmail.com writes: I vote for adding a feature that would let people post comments/code snippets to the documentation of other people's packages :) You mean turn every hackage project page into a mini wiki? Yep. How would such annotations/snippets/changes react to the next release of the package? Would they be per-package? per version? -Edward Kmett Yeah, that's the sticky part. I think I would make comments apply only to the version of the package they were submitted to, and then make it the package maintainer's responsibility to potentially update the hard documentation with the more useful comments when he/she releases the next version. This keeps the comments up to date, and helps prevent things from getting too clogged up with junk comments. It would also be nice to provide an easy way for maintainers to copy comments from older versions to newer ones, but this is a bit more tricky to implement. - Job ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Re: GSoC: Hackage 2.0
On Fri, Apr 9, 2010 at 10:46 AM, Malcolm Wallace malcolm.wall...@cs.york.ac.uk wrote: It would be even nicer if you could post comments to individual haskell definitions on the haddock page, and then hide most of them by default under an expander of some sort. Rather than, you know, providing the maintainer with a patch with some improved documentation? How much cooler it would be, if the wiki-like comment on Hackage could automatically be converted into a darcs/git/whatever patch, and mailed to the package author/maintainer by Hackage itself. This would indeed be awesome :) Though I think I would prefer to select, from a list of comments, which ones I would like to include, and then click the Download darcs/git/whatever patch button. (rather than get hit by emails ) - Job ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] ANN: darcsum.el 1.2 released
On 09/04/2010 01:27, Bas van Dijk wrote: On Thu, Apr 8, 2010 at 8:41 PM, Simon Michaelsi...@joyful.com wrote: With Christian's blessing, I have taken over maintenance of darcsum and would like to announce the 1.2 release: Nice! I'm a power user of darcsum and I'm definitely going to try out this release. I guess I'm not a power user since I wasn't encountering any problems with the previous version, but I've updated anyway. Darcsum does wonders for my productivity. Thanks Simon! Cheers, Simon ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What is the consensus about -fwarn-unused-do-bind ?
On Fri, Apr 9, 2010 at 10:20 AM, Neil Brown nc...@kent.ac.uk wrote: Ivan Lazar Miljenovic wrote: As of 6.12.1, the new -fwarn-unused-do-bind warning is activated with -Wall. This is based off a bug report by Neil Mitchell: http://hackage.haskell.org/trac/ghc/ticket/3263 . However, does it make sense for this to be turned on with -Wall? For starters, why should this warning apply only to do blocks and not to explicit usage of >>, etc.? That is, the following code (as specified in the above bug report) generates a warning: do { doesFileExist "foo"; return 1 } yet this doesn't: doesFileExist "foo" >> return 1 The comments in that bug report actually mention: My patch does not warn on uses of >>, only in do-notation, where the situation is more clear cut. I take >> to be an explicit sign that the user wants to ignore the result of the first action, whereas in do-notation it may be an accident. So I think it was the right decision. Relevant link: http://neilmitchell.blogspot.com/2008/12/mapm-mapm-and-monadic-statements.html 2. Use some function of type (Monad m) => m a -> m () instead of doing _ <-. 3. Duplicate the parser combinators in question so that I have one version that returns a value and another that does the main parse and then returns (); then use this second combinator in do blocks where I don't care about the returned value. 4. Put -fno-warn-unused-do-bind in the .cabal file. The first two options don't appeal to me as being excessive usage of boilerplate; the third involves too much code duplication. However, I am loath to just go and disable a warning globally. I'd be tempted by number two, but it's more typing to write ignore $ than _ <-, so maybe 1 is the best option after all. I've frequently encountered the annoyance of monadic return values -- but to satisfy type signatures rather than avoid this warning. For example, I have a CHP parallel operator: (<||>) :: CHP a -> CHP b -> CHP (a,b) and a function writeChannel :: Chanout a -> a -> CHP (). 
But if you try to write a function like: It's actually going to be named 'void': http://hackage.haskell.org/trac/ghc/ticket/3292 I don't think it's made it into a stable release yet. -- gwern ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] Type constrain in instance?
Hi, I ran into an issue when playing with Haskell's type system: parametric classes/instances have no way to specify constraints on their parameters. For example, I want a data structure to chain functions together with their derivatives, so that we can build a combined function along with its derivative, as follows:

import qualified Control.Category as Cat

data ChainableFunction a b = CF { cfF :: a -> b, cfDeriv :: a -> b }

class Module a b where
  (*) :: a -> b -> b

instance Cat.Category ChainableFunction where
  id :: Num a => ChainableFunction a a  -- GHC disallows this
  id = CF id (const 1)
  (.) :: (Num a, Num b, Num c, Module a b, Module b c) => ChainableFunction a b -> ChainableFunction b c -> ChainableFunction a c  -- GHC disallows this either
  (.) (CF f f') (CF g g') = CF (g . f) (\a -> f' a * g' (f a))

However, GHC only has kinds for classes/instances like (* -> * -> *), so we are forced to allow all possible types in the instance code. I'm not sure if I'm modelling things correctly, or whether there is another way to do the same thing? Cheers, Louis ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
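One common workaround, sketched below (not from the thread; the names ChainCat, cid, (.*) and (>>>>) are invented for illustration), is to give up on Control.Category and define your own category-like class whose methods carry exactly the constraints you need:

```haskell
{-# LANGUAGE MultiParamTypeClasses #-}

-- Chained function plus its derivative, as in the original post.
data CF a b = CF { cfF :: a -> b, cfDeriv :: a -> b }

-- Scalar action of a on b, standing in for the original Module class.
class Module a b where
  (.*) :: a -> b -> b

instance Module Double Double where
  (.*) = (*)

-- A category-like class whose methods carry the needed constraints,
-- which Control.Category's methods cannot express.
class ChainCat cat where
  cid    :: Num a => cat a a
  (>>>>) :: Module b c => cat a b -> cat b c -> cat a c

instance ChainCat CF where
  cid = CF id (const 1)
  CF f f' >>>> CF g g' = CF (g . f) (\a -> f' a .* g' (f a))  -- chain rule

main :: IO ()
main = do
  -- f x = 2x (derivative 2) composed with g y = y^2 (derivative 2y);
  -- the composite is 4x^2, whose derivative at 3 is 24.
  let h = CF (* 2) (const 2) >>>> CF (^ 2) (* 2) :: CF Double Double
  print (cfDeriv h 3)
```

The cost is losing the Control.Category instance (and with it the shared (.)/id vocabulary); later GHC extensions such as ConstraintKinds offer another route, but a bespoke class like this works on the GHC of the time.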
Re: [Haskell-cafe] Re: GSoC: Hackage 2.0
On Fri, Apr 9, 2010 at 10:59 AM, Job Vranish job.vran...@gmail.com wrote: On Fri, Apr 9, 2010 at 10:46 AM, Malcolm Wallace malcolm.wall...@cs.york.ac.uk wrote: How much cooler it would be, if the wiki-like comment on Hackage could automatically be converted into a darcs/git/whatever patch, and mailed to the package author/maintainer by Hackage itself. This would indeed be awesome :) Though I think I would prefer to select, from a list of comments, which ones I would like to include, and then click the Download darcs/git/whatever patch button. (rather than get hit by emails ) - Job That's what this proposal is, suggested a year ago: http://www.reddit.com/r/haskell/comments/8bylw/ask_haskell_reddit_how_can_we_improve_the/c08tc0q The wiki way to go about it where changes can still be sent upstream is to make patches for every documentation change (using a simple editing interface, not the HTML source), and update Hackage's Haddock tree with the patch. The author can then apply the patches. If he doesn't patch anything but submits a new version of the source, the patches will just be reapplied again, with prompts for any conflicts, which he or others can resolve. Complicated, but I think it would work. /Matt ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] GSoC Project: A Haddock + Pandoc documentation tool
2010/4/9 Alvaro Vilanova Vidal (viator) alv...@gmail.com: 2) Integration with Haddock docs. I think it would be best if the pages generated by this system and the pages generated by Haddock would be integrated as much as possible - both style wise (sharing CSS, structure, headers, footers, sidebars etc) and also in terms of hyper linking. Identifiers in your documentation should go to the Haddock docs. It should feel as if the Haddock pages are just a part of the whole documentation structure. I agree, and I tried to represent that in the example (but I created a new css). All the documentation will be rendered by Pandoc so there is no difference between the format of each side. [...] If we want /one/ API reference (and not one generated by Pandoc and one generated by Haddock), then I think we need to generate it in XHTML format directly. There is no sensible way to transmit all the semantic information we need to be able to style and lay out the different Haskell declarations through the markup languages supported by Pandoc. One can embed HTML but I suspect it's not the right solution. David ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Re: GSoC: Hackage 2.0
On 10:21 Fri 09 Apr , Job Vranish wrote: On Fri, Apr 9, 2010 at 9:46 AM, Ivan Lazar Miljenovic ivan.miljeno...@gmail.com wrote: Job Vranish job.vran...@gmail.com writes: I vote for adding a feature that would let people post comments/code snippets to the documentation of other people's packages :) You mean turn every hackage project page into a mini wiki? Yep. My worry with this is that users will fill carefully written documentation with irrelevant nonsense or, worse, factual errors. Moderation seems necessary. It would be even nicer if you could post comments to individual haskell definitions on the haddock page, and then hide most of them by default under an expander of some sort. Rather than, you know, providing the maintainer with a patch with some improved documentation? This is often more difficult than it sounds. The biggest obstacle to this approach is that a new hackage version of the package must be uploaded to update the documentation, and the authors (me included) tend to prefer to push new packages only when there are significant changes. It seems to me that the solution to this particular problem is to allow package maintainers to publish updated documentation separately from new packages. Steps involved currently: 0. pull down package source to build manually 1. add documentation/code snippet to source 2. build haddock documentation 3. debug bad/ugly syntax / missing line breaks that break haddock 4. generate a patch 5. email patch to author 6. wait a week for the author to actually get around to applying the patch to whatever repository the source resides in 7. wait several weeks for the author to release the next version of the package I suspect that most maintainers are amenable to simple emails containing change requests where documentation is concerned (please change the first sentence of bazify's documentation to ...), which means you can skip steps 0 through 4. Steps involved with mini wiki: 0. add [code] [/code] tags (or whatever) 1. copy 2. 
paste 3. submit I think making this process easier would greatly increase the community involvement in the generation of documentation and improve the quality of the documentation as a whole. I would imagine that this would not be a trivial task, but I think even something super simple (like what they have for the php documentation) would be much better than nothing. PHP's comments are a fine example of what I *don't* want to see polluting my documentation. There is very little signal to be found amongst that noise. On Fri, Apr 9, 2010 at 10:46 AM, Malcolm Wallace malcolm.wall...@cs.york.ac.uk wrote: How much cooler it would be, if the wiki-like comment on Hackage could automatically be converted into a darcs/git/whatever patch, and mailed to the package author/maintainer by Hackage itself. This would indeed be awesome :) Though I think I would prefer to select, from a list of comments, which ones I would like to include, and then click the Download darcs/git/whatever patch button. (rather than get hit by emails) The resulting patches will be patently useless, because the only sort of changes they can make is to append bullet points to existing documentation. Unless you are proposing that users can perform any change whatsoever on hackage? -- Nick Bowler, Elliptic Technologies (http://www.elliptictech.com/) ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What is the consensus about -fwarn-unused-do-bind ?
On Fri, Apr 9, 2010 at 6:44 AM, Ivan Lazar Miljenovic ivan.miljeno...@gmail.com wrote: As of 6.12.1, the new -fwarn-unused-do-bind warning is activated with -Wall. This is based off a bug report by Neil Mitchell: http://hackage.haskell.org/trac/ghc/ticket/3263 . However, does it make sense for this to be turned on with -Wall? Personally, I find it to be tremendously noisy and unhelpful, and I always edit my .cabal files to turn it off. I think of it as a usability regression. For starters, why should this warning apply only to do blocks and not to explicit usage of >>, etc.? That is, the following code (as specified in the above bug report) generates a warning: do doesFileExist "foo"; return 1 yet this doesn't: doesFileExist "foo" >> return 1 If monadic code is going to trigger the warning, then shouldn't _all_ monadic code do so, and not just that in do blocks? Secondly, a fair number of packages seem to be disabling this globally; the packages I know of (from mentions on mailing lists and grepping on /srv/code/*/*.cabal on code.haskell.org) that have the -fno-warn-unused-do-bind option being passed to GHC in their .cabal file include: * HsOpenCL * leksah-server * xmonad (including xmonad-contrib) * xmobar * pandoc My reason for bringing this up is that I'm soon about to release a new version of my graphviz library, and am debating what to do. Note that most of these warnings are being caused by usage of a monadic style of parsing (before anyone tells me I should be using an Applicative style instead, polyparse doesn't support Applicative, so I can't) and as such the return value is being evaluated anyway. The way I see it, I have 4 options: 1. Do as the warning suggests and preface usage of these parser combinators with _ <-. 2. Use some function of type (Monad m) => m a -> m () instead of doing _ <-. 3. 
Duplicate the parser combinators in question so that I have one version that returns a value and another that does the main parsing and then returns (); then use this second combinator in do blocks where I don't care about the returned value. 4. Put -fno-warn-unused-do-bind in the .cabal file. The first two options don't appeal to me, as they involve excessive boilerplate; the third involves too much code duplication. However, I am loath to just go and disable a warning globally. What does the Haskell community think? Is -fwarn-unused-do-bind a worthwhile warning (and code should be updated so as not to cause it to find anything to warn about)? Or is it more of a hindrance to be disabled? -- Ivan Lazar Miljenovic ivan.miljeno...@gmail.com IvanMiljenovic.wordpress.com ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
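[Editorial note: option 2 in Ivan's list can be a one-liner. A sketch, with void' as an invented name (later versions of base ship this very function as Control.Monad.void):]

```haskell
import System.Directory (doesFileExist)

-- Option 2: a combinator of type (Monad m) => m a -> m () that
-- discards the result explicitly, silencing -fwarn-unused-do-bind.
void' :: Monad m => m a -> m ()
void' m = m >> return ()

main :: IO ()
main = do
  void' (doesFileExist "foo")  -- no unused-do-bind warning here
  putStrLn "done"
```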
Re: [Haskell-cafe] Type constrain in instance?
You need to define the instance with `ChainableFunction' and not `CF' (the latter is value-level, not type-level). Once you make that switch, you'll find yourself with an instance that is not compatible with the definition of the `Category' class. Prelude Control.Category> :info Category class Category cat where id :: forall a. cat a a (.) :: forall b c a. cat b c -> cat a b -> cat a c -- Defined in Control.Category We see that `id' and `.' have no class constraints and that there is in fact nowhere to place a class constraint on `a', `b' or `c'. I think that what you're looking for are restricted categories (and restricted monads and functors, as well, perhaps). A cursory search suggests the `data-category' package. -- Jason Dusek ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] Can Haskell enforce the dimension?
Hi, In C++, templates can be used to enforce dimensions. For example, F=m*a is OK and F=m*t will issue a compile-time error. Is there a way to do this in Haskell? Thanks, Haihua ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Can Haskell enforce the dimension?
Excerpts from Haihua's message of Fri Apr 09 12:28:23 -0400 2010: In C++, template can be used to enforce the dimension. For example, F=m*a is OK and F=m*t will issue a compile time error. Is there a way to do this in Haskell? http://hackage.haskell.org/package/dimensional Cheers, Edward ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] transliteration with Haskell iconv?
Hi all, I'd like to transliterate a UTF-8 text to ASCII. With iconv(1), I would do iconv -f UTF-8 -t ASCII//TRANSLIT foo I've tried using Duncan's iconv library, but it seems to give me more question marks than iconv(1) would, which seems odd if it's the same library underneath. In other words, transliterating Aonach Mòr is a mountain in the Highlands of Scotland. On va boire un petit café si ça te dit. gives me with iconv(1) Aonach Mor is a mountain in the Highlands of Scotland. On va boire un petit cafe si ca te dit. but with my attempts at using Codec.Text.IConv and examples/hiconv -f utf-8 -t ascii --transliterate, I get Aonach M?r is a mountain in the Highlands of Scotland. On va boire un petit caf? si ?a te dit. Anybody run into this and know what to do? Thanks! PS. I'm on Ubuntu 9.04 if it makes any difference... -- Eric Kow http://www.nltg.brighton.ac.uk/home/Eric.Kow PGP Key ID: 08AC04F9 signature.asc Description: Digital signature ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Can Haskell enforce the dimension?
Hello Haihua, Friday, April 9, 2010, 8:28:23 PM, you wrote: In C++, template can be used to enforce the dimension. For example, F=m*a is OK and F=m*t will issue a compile time error. Is there a way to do this in Haskell? yes. but the standard * operation has type t -> t -> t, so you need either to use another operation or not import the standard Num class -- Best regards, Bulat mailto:bulat.zigans...@gmail.com ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
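[Editorial note: a self-contained sketch of Bulat's suggestion, using invented Mass/Accel/Time/Force wrappers rather than the dimensional package linked elsewhere in the thread. Giving each quantity its own type and a multiplication with the right signature makes F=m*a compile while F=m*t is rejected:]

```haskell
newtype Mass  = Mass  Double deriving Show
newtype Accel = Accel Double deriving Show
newtype Time  = Time  Double deriving Show
newtype Force = Force Double deriving Show

-- The only way to build a Force is from a Mass and an Accel,
-- so F = m*a typechecks and F = m*t cannot.
force :: Mass -> Accel -> Force
force (Mass m) (Accel a) = Force (m * a)

main :: IO ()
main = print (force (Mass 2) (Accel 9.8))
-- force (Mass 2) (Time 3) would be a compile-time type error
```

This only scales to a handful of fixed dimensions; the dimensional package mentioned in the replies tracks exponents of base units in the types instead.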
[Haskell-cafe] Re: transliteration with Haskell iconv?
On Fri, Apr 09, 2010 at 17:40:37 +0100, Eric Kow wrote: I'd like transliterate a UTF-8 text to ASCII. but with my attempts at using Codec.Text.IConv and examples/hiconv -f utf-8 -t ascii --transliterate, I get Aonach M?r is a mountain in the Highlands of Scotland. On va boire un petit caf? si ?a te dit. I had completely neglected to Google before writing my email (sigh) It seems like this Gnome hacker was suffering the same thing I was http://taschenorakel.de/mathias/2007/11/06/iconv-transliterations/ Hmm... -- Eric Kow http://www.nltg.brighton.ac.uk/home/Eric.Kow PGP Key ID: 08AC04F9 signature.asc Description: Digital signature ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Type constrain in instance?
On Fri, Apr 9, 2010 at 11:22 AM, Louis Zhuang louis.zhu...@acm.org wrote: However GHC only has kinds for class/instance like (* -> * -> *) so we are forced to allow all possible types in instance code. I'm not sure if I'm modelling things correctly or is there another way to do the same thing? As far as I know, it is indeed not generally possible to constrain unmentioned parameters to a type constructor in an instance declaration. There are workarounds involving modifications to the class definition, but as you want to use a class from the standard libraries, that helps very little. For the most part this is by design; the standard type classes are intended to be fully generic in their parameters. However, it seems that you are creating your own special-purpose data type, so one possible solution presents itself: Constrain the ChainableFunction type to permit construction only with numeric types. Simply placing a Num context on the data declaration fails, however, as this demands a similar constraint on functions using the type--which is exactly what we're trying to achieve, and so is spectacularly useless. One conventional solution is to instead conceal the actual data constructor from client code, instead exposing only constructor/deconstructor functions with appropriate constraints; the downsides to this are that client code cannot use pattern matching on the type, and that internal code must carefully maintain constraints anywhere the type is used. A more aesthetically appealing approach, if you're not averse to language extensions, is GADTs: Place constraints on the CF data constructor, not the type. With CF the sole constructor the constraints will be enforced everywhere, and best of all pattern matching on CF will provide the necessary context--making the constraint visible even inside the instance declaration for the supposedly fully-generic (.)! 
Alas, it now seems difficult to describe id; it must create a ChainableFunction with any type (as per the Category class), and without pattern matching on a ChainableFunction argument it has no way of getting the constraints. But consider that, for the same reasons, id has no way of actually doing anything with the type parameters with which it must construct a ChainableFunction, and thus shouldn't need to have them at all; further, the semantics of id are quite simple and essentially independent of its parameterized content. Thus, we can add another constructor to ChainableFunction, that takes no arguments and constructs a value of type (ChainableFunction a a), and extend the definition of (.) to make the behavior of the identity explicit. The result will look something like this: {-# LANGUAGE MultiParamTypeClasses, GADTs #-} import qualified Control.Category as Cat data ChainableFunction a b where CF :: (Num a, Num b) => (a -> b) -> (a -> b) -> ChainableFunction a b CFId :: ChainableFunction a a instance Cat.Category ChainableFunction where id = CFId CF g g' . CF f f' = CF (g . f) (\a -> f' a * g' (f a)) CFId . f = f g . CFId = g You've probably noticed that I've been ignoring the Module class. Unfortunately, the solution thus far is insufficient; a Module constraint on the CF constructor does work as expected, providing a context with (Module a b, Module b c), but the result requires an instance for Module a c, which is neither available nor easily obtained. I'm not sure how best to handle that issue; if you find the rest of this useful, hopefully it will have given you enough of a start to build a complete solution. - C. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
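[Editorial note: as Casey observes, the fully polymorphic version needs the Module class's heterogeneous multiplication to typecheck. Specialised to Double the chain-rule idea runs as-is; a minimal sketch, with CF/square/double as illustrative names:]

```haskell
-- A function packaged together with its derivative, Double-only.
data CF = CF (Double -> Double) (Double -> Double)

-- Composition implements the chain rule: (g . f)' a = f' a * g' (f a).
compose :: CF -> CF -> CF
compose (CF g g') (CF f f') = CF (g . f) (\a -> f' a * g' (f a))

apply :: CF -> Double -> Double
apply (CF f _) = f

derivAt :: CF -> Double -> Double
derivAt (CF _ f') = f'

square, double :: CF
square = CF (^ 2) (2 *)
double = CF (2 *) (const 2)

main :: IO ()
main = do
  print (apply   (square `compose` double) 3)  -- (2*3)^2 = 36
  print (derivAt (square `compose` double) 3)  -- d/dx (2x)^2 = 8x, so 24
```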
[Haskell-cafe] Difficulties installing LLVM bindings
Hi Haskell-Cafe, I can't get the LLVM bindings for Haskell to install. Does anyone know what I might need to do? Has anyone seen this error before? Here's the problem: (Installing from latest darcs source) llvm-haskell aran$ cabal install Resolving dependencies... ...snip... checking for unistd.h... yes checking llvm-c/Core.h usability... yes checking llvm-c/Core.h presence... yes checking for llvm-c/Core.h... yes checking for LLVMModuleCreateWithName in -lLLVMCore... no configure: error: could not find LLVM C bindings cabal: Error: some packages failed to install: llvm-0.7.0.1 failed during the configure step. The exception was: exit: ExitFailure 1 I've got the latest LLVM (from source) installed using the default sudo make install, i.e. into /usr/local. Using cabal install --configure-option --with-llvm-prefix=/usr/local doesn't change the result. LLVM is indeed based in /usr/local. The binaries are in /usr/local/bin/, libLLVM*.a is in /usr/local/lib, etc. Google just shows that this error has cropped up a couple times before. I tried manually disabling the check in the configure script but then generating code fails with massive run-time errors that look like link problems. I'm on an up-to-date Snow Leopard, using the latest llvm bindings from darcs, cabal-install 0.8.0, cabal library 1.8.0.2, ghc 6.12.1, and LLVM from svn. LLVM works fine from C++, at least. I worked through the Kaleidoscope tutorial and a few hand-coded .ll files without a hitch. Have you seen anything like this before? Any tips or things to try? I can't think of what magic setting I'm missing. Thanks, Aran ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Re: GSoC: Hackage 2.0
Hello all I support the immediate aims of Matthew Gruen's proposal and hope it gets adopted as a SoC project, but for the 'web2.0' aspects I largely agree with Nick Bowler. As a package author, checking disparate wiki pages to help people is more work than answering emails. Also, if people have install problems with (other peoples) packages on Windows and post a problem report to the Cafe, I'll often have a look to see if I can help as it isn't much trouble provided the dependency depth is near 0. I certainly won't be able to do that if there is a switch over to wiki for reporting problems. Best wishes Stephen ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] GSoC Project: A Haddock + Pandoc documentation tool
3) Configuration I haven't looked at this yet but I suspect people will not want another configuration file in their projects. Perhaps you could propose some kind of Cabal integration instead. It would be a shame if I had to figure out how to write a cabal file for my project and replace the entire build system, if even possible, just to get a documentation tool working. But if it's like haddock and can be configured just with flags and cabal integration just means cabal calling it for you, then there's no problem. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] GSoC Project: A Haddock + Pandoc documentation tool
Just a FYI, there's currently an ongoing discussion in the web-devel mailing list about the woeful state of documentation in the world of Haskell web development, and Haskell in general. It seems like some good might come of combining these two discussions as they both seem to be heading in a similar direction. At the very least those of us not in both the web-devel and haskell-cafe mailing lists might want to check the other one out. -R. Kyle Murphy -- Curiosity was framed, Ignorance killed the cat. On Fri, Apr 9, 2010 at 14:12, Evan Laforge qdun...@gmail.com wrote: 3) Configuration I haven't looked at this yet but I suspect people will not want another configuration file in their projects. Perhaps you could propose some kind of Cabal integration instead. It would be a shame if I had to figure out how to write a cabal file for my project and replace the entire build system, if even possible, just to get a documentation tool working. But if it's like haddock and can be configured just with flags and cabal integration just means cabal calling it for you, then there's no problem. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] Re: Type constrain in instance?
Casey McCann syntaxglitch at gmail.com writes: {-# LANGUAGE MultiParamTypeClasses, GADTs #-} import qualified Control.Category as Cat data ChainableFunction a b where CF :: (Num a, Num b) => (a -> b) -> (a -> b) -> ChainableFunction a b CFId :: ChainableFunction a a instance Cat.Category ChainableFunction where id = CFId CF g g' . CF f f' = CF (g . f) (\a -> f' a * g' (f a)) CFId . f = f g . CFId = g You've probably noticed that I've been ignoring the Module class. Unfortunately, the solution thus far is insufficient; a Module constraint on the CF constructor does work as expected, providing a context with (Module a b, Module b c), but the result requires an instance for Module a c, which is neither available nor easily obtained. I'm not sure how best to handle that issue; if you find the rest of this useful, hopefully it will have given you enough of a start to build a complete solution. - C. Thanks for the comment. If we try to use GADT to construct Cat.id, actually the (Num a) constraint is redundant because I just want 1 for the first derivative of x. However instance (Module a b, Module b c) => Module a c is a must for the chain rule... I'm looking at Data.Category suggested by Jason, because it allows a subset of Hask objects to be applied as parameters ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] ANN: darcsum.el 1.2 released
Simon Michael si...@joyful.com writes: Unfortunately it will still hang if darcs emits something it can't parse. An emacs process-filter is used to drive an interactive darcs command, like an expect script. The process-filter can receive darcs output in random chunks, so it's hard to distinguish a parse failure from partial output, unless the output has a recognisable prefix, which some of it does not (eg darcs amend's.) I'm not sure what a more robust way to drive darcs looks like. Maybe I'm not understanding the problem, but can't you just accumulate the output in an auxiliary variable and parse the output as a whole once the darcs process finishes? Thanks a lot for working on darcsum, btw! jao -- There are two ways to write error-free programs; only the third one works. - Alan Perlis, Epigrams in Programming ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] Re: ANN: darcsum.el 1.2 released
On 4/9/10 11:48 AM, Jose A. Ortega Ruiz wrote: Maybe i'm not understanding the problem, but cannot you just accumulate the output in an auxiliary variable and parse the ouput as a whole once the darcs process finishes? I think no, because it is driving darcs interactively to select hunks - it needs to parse a bit, answer y/n, parse some more... ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] What is the consensus about -fwarn-unused-do-bind ?
On Fri, Apr 09, 2010 at 09:07:29AM -0700, Bryan O'Sullivan wrote: On Fri, Apr 9, 2010 at 6:44 AM, Ivan Lazar Miljenovic ivan.miljeno...@gmail.com wrote: As of 6.12.1, the new -fwarn-unused-do-bind warning is activated with -Wall. This is based off a bug report by Neil Mitchell: http://hackage.haskell.org/trac/ghc/ticket/3263 . However, does it make sense for this to be turned on with -Wall? Personally, I find it to be tremendously noisy and unhelpful, and I always edit my .cabal files to turn it off. I think of it as a usability regression. Well, I would say it could be helpful, but given that even Text.Printf.printf triggers this warning in harmless statements, it is indeed a regression. iustin ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] persist and retrieve of IO type?
Is there a way to persist a [IO ()] to, say, a file, then retrieve it later and execute it using a sequence function? Thanks, Daryoush ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
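[Editorial note: IO actions themselves can't be serialised, but the standard workaround is to persist a datatype that *describes* the actions and re-interpret it after reading. A sketch, with Action and run as invented names:]

```haskell
-- A serialisable description of the effects we want to persist.
data Action = PutLine String | PrintInt Int
  deriving (Show, Read)

-- Interpret a description back into a real IO action.
run :: Action -> IO ()
run (PutLine s)  = putStrLn s
run (PrintInt n) = print n

main :: IO ()
main = do
  -- Persist via Show, retrieve via Read, then "sequence" with mapM_.
  writeFile "actions.txt" (show [PutLine "hello", PrintInt 42])
  acts <- fmap read (readFile "actions.txt")
  mapM_ run acts
```

Show/Read is the simplest encoding; a binary serialisation library would work the same way, since it is the Action values, not the IO values, that get written to disk.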
Re: [Haskell-cafe] mapM for vectors
Don Stewart schrieb: alexey.skladnoy: Hello I found that there is no monadic map for vector. It's possible to define such a map using conversion to list, but I suppose it's not efficient. I didn't make any measurements. mapM' :: Monad m => (a -> m b) -> V.Vector a -> m (V.Vector b) mapM' f = return . V.fromList <=< mapM f . V.toList Any suggestions about implementation of such function? Specifically I want to use Random monad. There's a tutorial here on using vectors, http://haskell.org/haskellwiki/Numeric_Haskell:_A_Vector_Tutorial#Random_numbers mapM is available via Fusion.Stream.Monadic.mapM But can it be efficient? It must handle cases like m = [] and thus creation of the (V.Vector b) result means a lot of copying, right? ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Can Haskell enforce the dimension?
On Sat, 10 Apr 2010, Haihua wrote: Hi, In C++, template can be used to enforce the dimension. For example, F=m*a is OK and F=m*t will issue a compile time error. http://www.haskell.org/haskellwiki/Libraries_and_tools/Mathematics#Physical_units ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Difficulties installing LLVM bindings
On Fri, Apr 09, 2010 at 01:38:21PM -0400, Aran Donohue wrote: Have you seen anything like this before? Any tips or things to try? I can't think of what magic setting I'm missing. Do you have llvm-config on your path? $ llvm-config I have in the past installed the LLVM bindings with a $HOME installation of LLVM, so /usr/local shouldn't be a problem. HTH, -- Felipe. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] GSoC Project: A Haddock + Pandoc documentation tool
2010/4/9 Alvaro Vilanova Vidal (viator) alv...@gmail.com: 3) Configuration I haven't looked at this yet but I suspect people will not want another configuration file in their projects. Perhaps you could propose some kind of Cabal integration instead. That would be awesome. I have to see how Cabal handles these files. Here's the relevant Cabal ticket: http://hackage.haskell.org/trac/hackage/ticket/330 David ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Difficulties installing LLVM bindings
On 9 April 2010 18:38, Aran Donohue aran.dono...@gmail.com wrote: Hi Haskell-Cafe, I can't get the LLVM bindings for Haskell to install. Does anyone know what I might need to do? Has anyone seen this error before? Here's the problem: (Installing from latest darcs source) I just tried this on my Mac and got the same problem. The problem is described in config.log: configure:3659: g++ -o conftest -g -O2 -I/usr/local/include -D_DEBUG -D_GNU_SOURCE -D__STDC_LIMIT_MACROS -D__STDC_CONSTANT_MACROS -m32 -L/usr/local/lib -lpthread -lm conftest.c -lLLVMCore -lLLVMSupport -lLLVMSystem 5 ld: warning: in /usr/local/lib/libLLVMCore.a, file is not of required architecture ld: warning: in /usr/local/lib/libLLVMSupport.a, file is not of required architecture ld: warning: in /usr/local/lib/libLLVMSystem.a, file is not of required architecture Undefined symbols: _LLVMModuleCreateWithName, referenced from: _main in cc5D6X0z.o So perhaps LLVM needs to be built universal or something? Cheers, Max ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Difficulties installing LLVM bindings
Bryan said a while ago that Manuel Chakravarty had some Mac related patches for LLVM, don't know if they have been integrated yet. On 9 April 2010 23:11, Max Bolingbroke batterseapo...@hotmail.com wrote: On 9 April 2010 18:38, Aran Donohue aran.dono...@gmail.com wrote: Hi Haskell-Cafe, I can't get the LLVM bindings for Haskell to install. Does anyone know what I might need to do? Has anyone seen this error before? Here's the problem: (Installing from latest darcs source) I just tried this on my Mac and got the same problem. The problem is described in config.log: configure:3659: g++ -o conftest -g -O2 -I/usr/local/include -D_DEBUG -D_GNU_SOURCE -D__STDC_LIMIT_MACROS -D__STDC_CONSTANT_MACROS -m32 -L/usr/local/lib -lpthread -lm conftest.c -lLLVMCore -lLLVMSupport -lLLVMSystem 5 ld: warning: in /usr/local/lib/libLLVMCore.a, file is not of required architecture ld: warning: in /usr/local/lib/libLLVMSupport.a, file is not of required architecture ld: warning: in /usr/local/lib/libLLVMSystem.a, file is not of required architecture Undefined symbols: _LLVMModuleCreateWithName, referenced from: _main in cc5D6X0z.o So perhaps LLVM needs to be built universal or something? Cheers, Max ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe -- Push the envelope. Watch it bend. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] ANN: shelltestrunner 0.8 released
I'm pleased to announce a new release of shelltestrunner, a tool which aims to make testing command-line programs easy. Thanks to Bernie Pope for contributing features and valuable feedback. Example: $ cabal install shelltestrunner ... $ cat > a.test # a simple test - run cat, provide stdin, check stdout cat 1 2 1 2 $ shelltest a.test a.test: [OK] Test Cases Total Passed 1 1 Failed 0 0 Total 1 1 Release notes: 2010/4/9 0.8 * rename executable to shelltest. The package might also be renamed at some point. * better built-in help * shell tests now include a full command line, making them more readable and self-contained. The --with option can be used to replace the first word with something else, unless the test command line begins with a space. * we also accept directory arguments, searching for test files below them, with two new options: --execdir execute tested command in same directory as test file --extension=EXT file extension of test files (default=.test) home: http://hackage.haskell.org/package/shelltestrunner repo: http://joyful.com/repos/shelltestrunner ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] ANN: darcsum.el 1.2 released
Simon Michael si...@joyful.com writes: With Christian's blessing, I have taken over maintenance of darcsum and would like to announce the 1.2 release: darcs get http://joyful.com/repos/darcsum -t 1.2 Is it possible to get an actual release version of this put somewhere rather than having to get a darcs repo? -- Ivan Lazar Miljenovic ivan.miljeno...@gmail.com IvanMiljenovic.wordpress.com ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] ANN: hledger 0.9 released
I'm pleased to announce a new hledger release, with many bugfixes and small improvements, GHC 6.12 support, and a separate library package to make building (h)ledger-compatible tools easier. Thanks to Oliver Braun and Gwern Branwen for code contributions this release. Just in time for tax filing! :) Patches, feedback, contributions welcome. Best, -Simon home: http://hledger.org Release notes: 2010/04/10 hledger 0.9 .. * ghc 6.12 support * split off hledger-lib package, containing core types and utils * parsing: ignore D, C, N, tag, end tag directives; we should now accept any ledger 2.6 file * parsing: allow numbers in commodities if double-quoted, like ledger * parsing: allow transactions with empty descriptions * parsing: show a better error for illegal month/day numbers in dates * parsing: don't ignore trailing junk in a smart date, eg in web add form * parsing: don't ignore unparsed text following an amount * parsing: @ was being treated as a currency symbol * add: fix precision handling in default amounts (#19) * add: elide last amount in added transactions * convert: keep original description by default, allow backreferences in replace pattern * convert: basic csv file checking, warn instead of dying when it looks wrong * convert: allow blank/comment lines at end of rules file * print: always show zero amounts as 0, hiding any commodity/decimal places/price, like ledger * register: fix bad layout with years 1000 * register: fix a Prelude.head error with reporting interval, --empty, and --depth * register: fix a regression, register should not show posting comments * register: with --empty, intervals should continue to ends of the specified period * stats: better output when last transaction is in the future * stats: show commodity symbols, account tree depth, reorder slightly * web: -fweb now builds with simpleserver; to get happstack, use -fwebhappstack instead * web: pre-fill the add form with today's date * web: help links, better search form wording * web: 
show a proper error for a bad date in add form (#17) * web: fix for unicode search form values * web: fix stack overflow caused by regexpr, and handle requests faster (#14) * web: look for more-generic browser executables * web: more robust browser starting (#6) * error message cleanups * more tests, refactoring, docs Stats: 58 days, 2 contributors, 102 commits since last release. Now at 3983 lines of non-test code, 139 tests, 53% coverage. ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
[Haskell-cafe] Re: GSoC: Hackage 2.0
The proposal as I submitted it is here: http://docs.google.com/View?docid=0Afa5MxwyB_zYZGhjanNrdjNfMjkzZjloOWNienYpageview=1hgd=1hl=en And it might need further revision as I talk to Duncan and the community. The advanced social features wouldn't get deployed by the end of the summer, but by that point the project would have structural integrity enough to write and release the features relatively easily, as well as act as a web service for their functionality. About concerns of excessive Web 2.0-ness: I agree that some of the features should be optional, particularly ones which duplicate other formal and informal bug-tracking systems. Ones which provide information to rank and navigate packages effectively are more essential. There's a lot to learn from other projects with similar systems and Haskellers' experience with those. Cheers, Matt Gruen ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Difficulties installing LLVM bindings
Thomas---The new Mac patches have been integrated in the version I'm using, according to the darcs log. Thanks Felipe---indeed, llvm-config is on the path. Max---I had the same realization about config.log. I managed to get past it by forcing the Haskell-LLVM build into 64-bit mode, but that led to new problems with ghc-pkg. I think we need to rebuild LLVM forcing 32-bit mode, but I haven't yet found the results of this. I'd love to hear what you did if you manage to get it going and compile programs. Thanks, Aran On Fri, Apr 9, 2010 at 7:07 PM, Thomas Schilling nomin...@googlemail.com wrote: Bryan said a while ago that Manuel Chakravarty had some Mac related patches for LLVM, don't know if they have been integrated yet. On 9 April 2010 23:11, Max Bolingbroke batterseapo...@hotmail.com wrote: On 9 April 2010 18:38, Aran Donohue aran.dono...@gmail.com wrote: Hi Haskell-Cafe, I can't get the LLVM bindings for Haskell to install. Does anyone know what I might need to do? Has anyone seen this error before? Here's the problem: (Installing from latest darcs source) I just tried this on my Mac and got the same problem. The problem is described in config.log: configure:3659: g++ -o conftest -g -O2 -I/usr/local/include -D_DEBUG -D_GNU_SOURCE -D__STDC_LIMIT_MACROS -D__STDC_CONSTANT_MACROS -m32 -L/usr/local/lib -lpthread -lm conftest.c -lLLVMCore -lLLVMSupport -lLLVMSystem 5 ld: warning: in /usr/local/lib/libLLVMCore.a, file is not of required architecture ld: warning: in /usr/local/lib/libLLVMSupport.a, file is not of required architecture ld: warning: in /usr/local/lib/libLLVMSystem.a, file is not of required architecture Undefined symbols: _LLVMModuleCreateWithName, referenced from: _main in cc5D6X0z.o So perhaps LLVM needs to be built universal or something? Cheers, Max ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe -- Push the envelope. Watch it bend. 
___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Re: [Haskell-cafe] Re: GSoC: Hackage 2.0
wikigracenotes: The proposal as I submitted it is here: http://docs.google.com/View?docid=0Afa5MxwyB_zYZGhjanNrdjNfMjkzZjloOWNienYpageview=1hgd=1hl=en And it might need further revision as I talk to Duncan and the community. The advanced social features wouldn't get deployed by the end of the summer, but by that point the project would have structural integrity enough to write and release the features relatively easily, as well as act as a web service for their functionality. About concerns of excessive Web 2.0-ness: I agree that some of the features should be optional, particularly ones which duplicate other formal and informal bug-tracking systems. Ones which provide information to rank and navigate packages effectively are more essential. There's a lot to learn from other projects with similar systems and Haskellers' experience with those. Very pragmatic! Thanks for submitting a proposal! ___ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe