Re: [Haskell-cafe] about Haskell code written to be too smart

2009-03-26 Thread Colin Adams
2009/3/25 wren ng thornton w...@freegeek.org:
  Most of the documentation is in research papers, and a normal
  programmer doesn't want to read these papers.

 Yes, and no. There is quite a bit of documentation in research papers, and
 mainstream programmers don't read research. However, this is a big part of
 what makes the Haskell community what it is. There are plenty of
 non-academics here, but they have the willingness to read these papers (even
 if it's out of the ordinary) and the desire to learn radical new things
 (because they're out of the ordinary).

Yes.
BUT ...

when I look up the Haddock-generated documentation for a function, I
DON'T appreciate it if that is in the form of a hyperlink to a
research paper.
And that occurs in several of the libraries shipped with GHC for instance.

A reference to a research paper is fine to show where the ideas came
from, but that is not where the library documentation should be.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: about Haskell code written to be too smart

2009-03-26 Thread Peter Verswyvelen
After reading the chapter about parsers in Bird's book, I tried to implement
a simple parser myself, and this was a great experience, a real eye opener
on how declarative and composable Haskell can be. Haskell is... well magic
:-) It gave me same kind of joy I had when I made my first moving sprite on
the Commodore 64 in 1985.
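
For anyone curious what that composability feels like, here is a minimal
sketch of a hand-rolled parser combinator (my own toy code, not the
definitions from Bird's book):

    import Data.Char (isDigit)

    -- a parser maps an input string to the possible (result, rest) pairs
    newtype Parser a = Parser { parse :: String -> [(a, String)] }

    -- the basic building block: consume one character satisfying a predicate
    satisfy :: (Char -> Bool) -> Parser Char
    satisfy p = Parser $ \s -> case s of
      (c:cs) | p c -> [(c, cs)]
      _            -> []

    -- sequence two parsers, pairing their results
    andThen :: Parser a -> Parser b -> Parser (a, b)
    andThen pa pb = Parser $ \s ->
      [ ((a, b), s'') | (a, s') <- parse pa s, (b, s'') <- parse pb s' ]

    digit :: Parser Char
    digit = satisfy isDigit

    -- parse (digit `andThen` digit) "42x"  ==  [(('4','2'), "x")]
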
On Thu, Mar 26, 2009 at 12:44 AM, Heinrich Apfelmus 
apfel...@quantentunnel.de wrote:

 Manlio Perillo wrote:
  Heinrich Apfelmus ha scritto:
 
  I think you'd have had a much easier time by starting with a proper book
  right away, like Richard Bird's Introduction to Functional Programming
  in Haskell, accompanied by Real World Haskell.
 
  Unfortunately, one year ago Real World Haskell was not here.
  And note that I have no problems with basic functional programming
  concepts.
  My problems are specific to Haskell.

 Despite the title, Bird's book is quite specific to Haskell, in
 particular concerning the philosophy of composing solutions from
 building blocks as opposed to primitive recursion.

 I'd say that every serious Haskell programmer should have it on his
 bookshelf (even if only for show ;) ).


 Regards,
 apfelmus

 --
 http://apfelmus.nfshost.com

 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Fwd: [Haskell-cafe] Re: ANNOUNCE: WinGhci, a GUI for GHCI on Windows

2009-03-26 Thread Pepe Gallardo
-- Forwarded message --
From: Pepe Gallardo pepe.hask...@gmail.com
Date: 2009/3/25
Subject: Re: [Haskell-cafe] Re: ANNOUNCE: WinGhci, a GUI for GHCI on Windows
To: Benjamin L.Russell dekudekup...@yahoo.com



Hi Benjamin,
The only requirement to run WinGhci is to have GHC installed and to have GHC's
bin directory on your path. The way to start the application is by executing
winghci.exe. Install.exe is a small program that associates .hs files with
WinGhci (it should be run from the same directory where WinGhci is
installed). Note that this association is optional. I have added a wiki page
on this.

2009/3/23 Benjamin L.Russell dekudekup...@yahoo.com

 Just another couple of thoughts for possible additional improvement:

 1.  It would be even nicer if WinGhci added a menu entry to the
 Start menu automatically, as WinHugs does.

 2.  For the proposed menu entry, it would also probably be a good idea
 if WinGhci added a folder for that menu entry, and included a link to
 a README file in there, as WinHugs does also.

 3.  In order to distinguish WinGhci from GHCi, it might also be
 helpful if WinGhci had a different icon; the current WinGhci icon is
 identical to the one for GHCi (perhaps use the forthcoming official
 Haskell logo here?).



The WinGhci icon is different from (although quite similar to) the GHCi icon, at
least at medium to large icon sizes.



 Just my two cents for now

 -- Benjamin L. Russell
 --
 Benjamin L. Russell  /   DekuDekuplex at Yahoo dot com
 http://dekudekuplex.wordpress.com/
 Translator/Interpreter / Mobile:  +011 81 80-3603-6725
 Furuike ya, kawazu tobikomu mizu no oto.
 -- Matsuo Basho

 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Fwd: [Haskell-cafe] Re: ANNOUNCE: WinGhci, a GUI for GHCI on Windows

2009-03-26 Thread Pepe Gallardo
2009/3/23 Andrew Butterfield andrew.butterfi...@cs.tcd.ie

 Benjamin L.Russell wrote:

 This is wonderful--just what I was waiting for!  The application looks
 beautiful, and I'm very happy that GHCi now has a matching GUI
 application along the lines of WinHugs.


 Indeed - me too !

 It would be even better if you could provide some
 installation/uninstallation information.  I unzipped the contents of
 WinGhci-1.0-bin.zip into the C:\Documents and Settings\username\My
 Documents\ folder, but there was no README file.


 And version information - I tried it with GHC 6.4 and it died (Not
 Responding)

 What version of GHCi does it require?

 And no, I won't upgrade GHC just yet (this is the latest GHC/wxHaskell
 combo that works for me with GHCi...)





I have tested it with GHC 6.10.1. The oldest GHC version that I have
currently installed is GHC 6.6, and it seems to work too. Do you have GHC's
bin directory on your path?



 --
 
 Andrew Butterfield Tel: +353-1-896-2517 Fax: +353-1-677-2204
 Foundations and Methods Research Group Director.
 School of Computer Science and Statistics,
 Room F.13, O'Reilly Institute, Trinity College, University of Dublin
   http://www.cs.tcd.ie/Andrew.Butterfield/
 


 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] about Haskell code written to be too smart

2009-03-26 Thread Loup Vaillant
2009/3/26 Thomas Hartman tphya...@gmail.com:
 Beginner list processing code can and often does go awry when presented with 
 infinite lists.

 I didn't mean code that a beginner would write, I mean code that would
 be easy to understand for a beginner to read

For that, in this particular example, a type signature would make the
function more than readable.

 -- that is, explicit
 pattern matching, explicit recursion, no gratuitous use of state
 monad.

 […]

 What I like about the pattern matching is the totality -- you see all
 the possible inputs, and you see what happens.

What I read here is make the operational semantics more explicit. Do
you mean it? Personally, I see explicit operational semantics as a
last resort, not as a facilitator. Most of the time, I care about what
*is*, not what happens. Pattern matching (compared to function
composition) is not easier for beginners. It is easier for seasoned
imperative programmers, because otherwise, they have to reformat their
brain. In my opinion.

Now imagine you have to write your function as a solution to a math
assignment. Imagine that fold, map, zip, and the like are usual
functions (usual like sinus, exp, ln…). Of course, you have to prove
your solution correct.

Do you prefer a mere composition of four usual functions (possibly in
pointed notation), or do you prefer a recursive definition? Back in
high school, composition of functions (or nested formulas) was easy.
Recursive definitions were the advanced stuff. If you want the utmost
confidence about the correctness of your function, I think you want to
reason about the function composition.
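
To make the comparison concrete, here are the two styles for the takeList
example discussed in this thread (the names here are mine):

    -- explicit recursion: every case is spelled out
    takeListRec :: [Int] -> [a] -> [[a]]
    takeListRec []     _  = []
    takeListRec _      [] = []
    takeListRec (n:ns) xs = ys : takeListRec ns rest
      where (ys, rest) = splitAt n xs

    -- composition of usual functions: the recursion lives inside
    -- zipWith/scanl instead of a hand-written loop
    takeListComp :: [Int] -> [a] -> [[a]]
    takeListComp ns xs = zipWith take ns (init (scanl (flip drop) xs ns))

(As discussed elsewhere in the thread, the two are not identical on all
inputs; the point here is only the difference in style.)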



 With the state version, there's a lot of behind-the-scenes magic, and
 as we've seen, things can go wrong.

Well, about that, I cannot talk (I'm still a beginner).

Loup
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Re: about Haskell code written to be too smart

2009-03-26 Thread John Lato
 From: wren ng thornton w...@freegeek.org
 Dan Weston wrote:
 So to be clear with the terminology:

 inductive   = good consumer?
 coinductive = good producer?

 So fusion should be possible (automatically? or do I need a GHC rule?) with
   inductive . coinductive

 Or have I bungled it?

 Not quite. Induction means starting from base cases and building things
 upwards from those. Coinduction is the dual and can be thought of as
 starting from the ceiling and building your way downwards (until you hit
 the base cases, or possibly forever).


( quite a lot of text trimmed for brevity)

Wren, thank you for contributing this post.  Coming from the point of
view of someone who doesn't grok it all yet, this is the best
commentary I've read on deforestation/fusion, and in fact is helpful
for category theory newbies as well (at least I found it so).  It
really should go on a wiki somewhere.
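
A rough toy illustration of the distinction (my own example, not wren's): a
foldr-based consumer is defined by induction on its input, while an
unfoldr-based producer is defined coinductively from a seed and may go on
forever:

    import Data.List (unfoldr)

    -- inductive consumer: defined by the structure of its (finite) input
    sumList :: [Int] -> Int
    sumList = foldr (+) 0

    -- coinductive producer: builds its output from a seed, possibly forever
    naturalsFrom :: Int -> [Int]
    naturalsFrom = unfoldr (\n -> Just (n, n + 1))
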

John Lato
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: Exception handling in numeric computations

2009-03-26 Thread John Lato
On Wed, Mar 25, 2009 at 7:02 PM, Gregory Petrosyan
gregory.petros...@gmail.com wrote:
 First of all, thanks everybody for the discussion -- very interesting to read!

 Apologies if this is off-topic a bit.

 While reading, I have a feeling that proposed solutions are somewhat similar
 to checked exceptions. And IMO they turned out to be more harmful than useful.

Do you mean checked exceptions, or the proposed solutions?  IMO the
problem with checked exceptions is that they conflate two ideas by
trying to use the existing exception framework to do something that
can't otherwise be done by the language.  This leads to something that
doesn't work well for either use.

Languages with checked exceptions usually use them for two purposes:

1.  Exceptional conditions - disk full, CPU on fire, etc.
2.  Error handling - invalid arguments to a function, attempt to
invert non-invertible matrix, etc.

In the first case, checked exceptions are a pain because there are so
many different possible exceptional conditions to enumerate.  Or
alternatively you have an IOException that can be one of hundreds of
actual exceptional conditions.  This is exactly what regular
exceptions are for, and in general there's little that can be done
except terminate the program, except for certain special cases
explicitly handled by the programmer.

For the second case, checked exceptions actually work okay IMO.  The
syntax is a bit clunky, but they serve the purpose.  That is, they
specifically indicate what functions may fail and the failure mode.
They also allow calling code to either handle it there or pass the
error up to a higher level as appropriate.  The syntax is usually
ugly, though.

Combining these two functions into one tool gives suboptimal results.
If you're using checked exceptions for error handling, then you also
end up using them for exception handling.  That means you have
IOException thrown by just about everything (and in theory any
function in an imperative language could have an IOException due to
hardware fault), leading to complex exception handlers with a half
dozen case statements.

There are a few reasons why Haskell's approach is (or at least can be)
different.  Compare the error handling models.  Checked exceptions are
bolted on to an existing tool attempting to make it serve a different
purpose.  It seems like a good idea, but ends up being painful because
the two uses are at cross purposes.  Instead of using a specific
implementation and trying to alter it for another use, Haskell uses a
very general facility, the type system, to solve one problem, and
keeps the exception handling facility separate.
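
As a small sketch of that separation (names and details are mine, purely for
illustration): errors live in the types, exceptional conditions stay in IO:

    import Control.Exception (IOException, try)

    -- error handling in the type: the caller must deal with failure
    parsePort :: String -> Either String Int
    parsePort s = case reads s of
      [(n, "")] | n > 0 && n < 65536 -> Right n
      _                              -> Left ("not a valid port: " ++ s)

    -- exceptional conditions are caught (if at all) in IO
    readConfig :: FilePath -> IO (Either IOException String)
    readConfig path = try (readFile path)
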

Haskell's syntax for error handling is also much nicer.  Nobody likes
using methods that could possibly throw a dozen different exceptions,
some of which are IOException and some are not.

Actually, Haskell IO makes a big difference to this.  Conceptually,
you could think of Haskell functions as purely mathematical
constructs.  In particular, they can't interact with the physical
world.  This is true even for the IO monad.  Exceptional conditions
only arise when the Haskell runtime actually tries to *evaluate* an IO
function.  As a consequence exceptional conditions only arise as part
of IO actions, and can only be handled by IO actions (because they're
interacting with the run-time environment).  This distinction makes it
possible to enforce a separation between exceptions and error
conditions, which is generally not possible in imperative languages
where a function could fail either for a computational reason (divide
by 0) or a hardware/runtime fault.

Even in Haskell this separation isn't absolute.  Programmer errors,
such as dividing by 0, can and do lead to exceptional conditions.  The
proper way to handle dividing by 0 is to not do it in the first place,
but if it happens because of a programming error, you've got an
exception.  Unfortunately this encourages programmers to think that
handling the exception is the proper way to deal with this condition,
but it isn't.

I've only recently come around to the camp of treating exception
handling and errors separately, so some of these thoughts may be a bit
loose for the moment.  In particular my thoughts from the above
paragraph have only recently become clear.

Henning T., FYI your constant advocacy has gotten at least one person
around to this view.

Cheers,
John Lato
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Use unsafePerformIO to catch Exception?

2009-03-26 Thread Jules Bean

wren ng thornton wrote:
I have long been disappointed by a number of `error`s which shouldn't 
be. For example, the fact that `head` and `div` are not total strikes me 
as a (solvable) weakness of type checking, rather than things that 
should occur as programmer errors/exceptions at runtime. The use of 
`error` in these cases muddies the waters and leads to a laissez-faire 
attitude about when it's acceptable to throw one's hands up in despair 
and use `error` rather than writing a better function.


head uses error in precisely the correct, intended fashion.

head has a precondition (only call on non-empty lists) and the error 
is just there to give you some hint that you made a mistake - got the 
precondition wrong.


It's certainly not intended to be catchable. And that is a correct use 
of error.


There are programming styles which avoid using 'head'. You are free to 
use those if you don't like it. I myself am content to use 'head' on 
lists which I know are guaranteed to be non-empty.
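
For completeness, one such style is simply to give the empty case back to the
caller (a trivial sketch; the names are mine):

    -- a total variant of head: the caller decides what [] means
    headMay :: [a] -> Maybe a
    headMay []    = Nothing
    headMay (x:_) = Just x

    -- or keep the precondition but make the default explicit at the call site
    headOr :: a -> [a] -> a
    headOr d []    = d
    headOr _ (x:_) = x
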


Jules
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] about Haskell code written to be too smart

2009-03-26 Thread Claus Reinke

Continuing our adventures into stylistic and semantic differences:-)

Comparing the 'State' and explicit recursion versions

   takeListSt = evalState . mapM (State . splitAt)

   -- ..with a derivation leading to..

   takeListSt []    s = []
   takeListSt (h:t) s = x : takeListSt t s'
 where (x,s') = splitAt h s

instead of

   takeList [] _ =  []
   takeList _ [] =  []
   takeList (n : ns) xs  =  head : takeList ns tail
   where (head, tail) = splitAt n xs

we can see some differences, leading to different functions:

   *Main> null $ takeListSt [1] undefined
   False
   *Main> null $ takeList [1] undefined
   *** Exception: Prelude.undefined
   *Main> takeList [0] []
   []
   *Main> takeListSt [0] []
   [[]]

and similarly for the 'scanl' version

   takeListSc ns xs = zipWith take ns $ init $ scanl (flip drop) xs ns

Depending on usage, these differences might not matter, but what if
we want these different styles to lead to the same function, with only
stylistic and no semantic differences, taking the explicit recursion as
our spec?

In the 'State' version, the issue is that 'mapM' does not terminate
early, while the specification requires an empty list whenever 'xs'
(the state) is empty. Following the derivation at

http://www.haskell.org/pipermail/haskell-cafe/2009-March/058603.html

the first step where we have a handle on that is after unfolding
'sequence':

   takeListSt = evalState . foldr k (return []) . map (State . splitAt)
 where k m m' = do x <- m; xs <- m'; return (x:xs)

If we change that to

   takeListSt' = evalState . foldr k (return []) . map (State . splitAt)
 where k m m' = cutNull $ do x <- m; xs <- m'; return (x:xs)
   cutNull m = do s <- get; if null s then return [] else m

and continue with the modified derivation, we should end up with
the right spec (I haven't done this, so you should check!-). This
isn't all that elegant any more, but support for 'mapM' with early
exit isn't all that uncommon a need, either, so one might expect
a 'mapM' variant that takes a 'cut' parameter to make it into the
libraries.
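
For concreteness, one possible shape for such a 'mapM with cut' in the State
monad (a sketch only; as with the derivation above, it would need to be
checked against the spec):

    import Control.Monad.State

    -- stop as soon as the predicate holds for the current state
    mapMCut :: (s -> Bool) -> (a -> State s b) -> [a] -> State s [b]
    mapMCut _   _ []     = return []
    mapMCut cut f (x:xs) = do
      s <- get
      if cut s
        then return []
        else do y  <- f x
                ys <- mapMCut cut f xs
                return (y : ys)

    takeListCut :: [Int] -> [a] -> [[a]]
    takeListCut ns xs = evalState (mapMCut null (State . splitAt) ns) xs
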

For the 'scanl' version, we have a more direct handle on the issue:
we can simply drop the offending extras from the 'scanl' result,
replacing 'init' with 'takeWhile (not.null)':

   takeListSc' ns xs = zipWith take ns $ takeWhile (not.null) $ scanl (flip drop) xs ns

A somewhat abbreviated derivation at the end of this message
seems to confirm that this matches the spec (as usual with proofs,
writing them down doesn't mean that they are correct, but that
readers can check whether they are).

(btw, both 'takeListSt'' and 'takeListSc'' pass Thomas' 'testP', as does
his 'partitions', but 'partitions' is not the same function as 'takeList':
consider 'null $ takeList [1] undefined' and 'takeList [0] []' ;-)

Someone suggested using 'mapAccumL' instead of 'State', and
that does indeed work, only that everything is the wrong way round:

   takeListMAL = (snd.) . flip (mapAccumL (((snd &&& fst).).(flip splitAt)))

This is an example where all the cleverness is spent on the
irrelevant details, giving them way too much importance. So one
might prefer a version that more clearly says that this is mostly
'mapAccumL splitAt', with some administratory complications
that might be ignored on cursory inspection:

   takeListMAL' = mapAccumL' splitAt'
     where splitAt' l n       = swap $ splitAt n l
           mapAccumL' f l acc = snd $ mapAccumL f acc l
           swap (x,y)         = (y,x)

Of course, this suffers from the does not terminate early issue,
but as this thread encourages us to look at functions we might
not otherwise consider, I thought I'd follow the suggestion, and
perhaps someone might want to modify it with a 'mapAccumL'
with cutoff, and demonstrate whether it matches the spec;-)

Claus

-- view transformation: reducing the level of abstraction

takeList ns xs = zipWith take ns $ takeWhile (not.null) $ scanl (flip drop) xs ns

-- fetch definitions of 'zipWith', 'takeWhile', and 'scanl'

takeList ns xs = zipWith take ns $ takeWhile (not.null) $ scanl (flip drop) xs ns
  where scanl f q ls = q : case ls of
                             []   -> []
                             x:xs -> scanl f (f q x) xs
        takeWhile _ []                 = []
        takeWhile p (x:xs) | p x       = x : takeWhile p xs
                           | otherwise = []
        zipWith f (a:as) (b:bs) = f a b : zipWith f as bs
        zipWith _ _      _      = []

-- specialize for 'take', 'not.null', and 'flip drop'

takeList ns xs = zipWith ns $ takeWhile $ scanl xs ns
  where scanl q ls = q : case ls of
                           []   -> []
                           x:xs -> scanl (drop x q) xs
        takeWhile []                    = []
        takeWhile (x:xs) | not (null x) = x : takeWhile xs
                         | otherwise    = []
        zipWith (a:as) (b:bs) = take a b : zipWith as bs
        zipWith _      _      = []

-- fuse 'takeWhile' and 'scanl' into 'tws'

takeList ns 

Re: [Haskell-cafe] Use unsafePerformIO to catch Exception?

2009-03-26 Thread Henning Thielemann


On Thu, 26 Mar 2009, Jules Bean wrote:

There are programming styles which avoid using 'head'. You are free to use 
those if you don't like it. I myself am content to use 'head' on lists which 
I know are guaranteed to be non-empty.


Since I became aware that viewl (Data.Sequence) and uncons (ByteString) 
are appropriate in most cases where I used 'head' (and 'tail' and 'null') 
before, I use 'viewL' from my utility-ht package. Data.List.HT.viewL is 
total. I also often use Data.List.HT.switchL.
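
For readers who don't know these, a minimal version of such a total view (my
own sketch, not the utility-ht definition):

    viewL :: [a] -> Maybe (a, [a])
    viewL []     = Nothing
    viewL (x:xs) = Just (x, xs)

    -- using it instead of null/head/tail
    firstWord :: String -> String
    firstWord s = case viewL (words s) of
      Nothing     -> ""
      Just (w, _) -> w
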

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] [ANN] I/O library for Windows

2009-03-26 Thread Felix Martini
 As I understand it, programs compiled with GHC currently use MSYS for all
 I/O operations, resulting in all kinds of strange behaviour in corner cases.
 (E.g., if you use System.Directory and ask whether C:\\ is a directory, it
 says no, yet you can read the contents of that directory.) I would have
 thought that calling the Win32 API directly would probably fix most of these
 minor glitches. Is that what this package is intending to do?

Yes. When Simon adds the new Handle API to GHC I will add support for
creating Haskell handles that will be implemented internally with
Windows handles and call WinAPI functions. The current winio package
contains mostly low-level functions that directly expose Windows
handles which is useful for writing server code etc.

Eventually, when the library is stable, I would prefer if it becomes
part of GHC's libraries (or a new default Haskell I/O library for all
operating systems).

Regards,
Felix
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] about Haskell code written to be too smart

2009-03-26 Thread Lutz Donnerhacke
* Claus Reinke wrote:
 Continuing our adventures into stylistic and semantic differences:-)

It's good practice to keep a simple-minded version of the code and to use
QuickCheck to try to find differences between the optimized and trivial
versions. It's also good practice to check that the optimized version is
really faster/smaller than the simple one.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] g++ std:map vs GHC IntMap

2009-03-26 Thread Manlio Perillo

Hi.

I have tried to compare performance of the g++ std::map versus the GHC 
IntMap.


The test consists in adding 1000 elements to an empty map.

Haskell code is here:
http://hpaste.org/fastcgi/hpaste.fcgi/view?id=2899

C++ code is here:
http://hpaste.org/fastcgi/hpaste.fcgi/view?id=2900


The execution time and CPU usage is almost the same.
However the C++ version requires 305 MB, the GHC version 617 MB.

gcc 4.3.2
ghc 6.8.2
on Debian Linux Lenny, i386.


Isn't it possible to optimize memory usage in cases like this?


I also tried with Data.HashTable:
http://hpaste.org/fastcgi/hpaste.fcgi/view?id=2902

but memory usage is 703 MB, and execution time is about 4.5 times slower!
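
The hpaste links above carry the actual code; a rough sketch of the kind of
benchmark described (the element count here is only illustrative) would be:

    import Data.List (foldl')
    import qualified Data.IntMap as IM

    main :: IO ()
    main = do
      let n = 1000000 :: Int
          -- insert n keys into an empty IntMap, forcing the accumulator
          m = foldl' (\acc k -> IM.insert k k acc) IM.empty [1 .. n]
      print (IM.size m)
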


Thanks  Manlio Perillo


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] about Haskell code written to be too smart

2009-03-26 Thread Manlio Perillo

Claus Reinke ha scritto:

Continuing our adventures into stylistic and semantic differences:-)



Can you write this analysis on the wiki?

Thanks!

Manlio
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: Exception handling in numeric computations

2009-03-26 Thread Xiao-Yong Jin
John Lato jwl...@gmail.com writes:

 Even in Haskell this separation isn't absolute.  Programmer errors,
 such as dividing by 0, can and do lead to exceptional conditions.  The
 proper way to handle dividing by 0 is to not do it in the first place,
 but if it happens because of a programming error, you've got an
 exception.  Unfortunately this encourages programmers to think that
 handling the exception is the proper way to deal with this condition,
 but it isn't.

So I have another question.  Is the following function safe
and legitimate?

 safeDiv :: (Exception e, Integral a) =>
            a -> a -> Either e a
 safeDiv x y = unsafePerformIO . try . evaluate $ div x y

I believe it should be okay to use this 'safeDiv'.  What do
you think?
-- 
c/*__o/*
\ * (__
*/\  
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: Exception handling in numeric computations

2009-03-26 Thread Henning Thielemann


On Thu, 26 Mar 2009, Xiao-Yong Jin wrote:


So I have another question.  Is the following function safe
and legitimate?


safeDiv :: (Exception e, Integral a) =>
           a -> a -> Either e a
safeDiv x y = unsafePerformIO . try . evaluate $ div x y


I believe it should be okay to use this 'safeDiv'.  What do


I think that question is the wrong way around. The real question is, why do
you want to solve your problem using unsafePerformIO?

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Copying data files when installing, using Cabal

2009-03-26 Thread Henk-Jan van Tuyl


L.S.,

I am trying to install an application with data files (I am developing  
that application). These data files are mentioned in the .cabal file,  
after the Data-files: keyword. When I give command runhaskell setup  
install, the data files are not copied; how can I correct this?


--
Regards,
Henk-Jan van Tuyl


--
http://functor.bamikanarie.com
http://Van.Tuyl.eu/
--


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Copying data files when installing, using Cabal

2009-03-26 Thread Neil Mitchell
Hi Henk-Jan,

It works for me, see for example HLint:
http://community.haskell.org/~ndm/darcs/hlint

And a blog I wrote on it:
http://neilmitchell.blogspot.com/2008/02/adding-data-files-using-cabal.html

The data files are copied in to the data directory upon install. The
data-files: bit must be near the start, not under executable/library
subheadings.

Thanks

Neil

On Thu, Mar 26, 2009 at 3:58 PM, Henk-Jan van Tuyl hjgt...@chello.nl wrote:

 L.S.,

 I am trying to install an application with data files (I am developing that
 application). These data files are mentioned in the .cabal file, after the
 Data-files: keyword. When I give command runhaskell setup install, the
 data files are not copied; how can I correct this?

 --
 Regards,
 Henk-Jan van Tuyl


 --
 http://functor.bamikanarie.com
 http://Van.Tuyl.eu/
 --


 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] g++ std:map vs GHC IntMap

2009-03-26 Thread Brandon S. Allbery KF8NH

On 2009 Mar 26, at 11:39, Manlio Perillo wrote:

The execution time and CPU usage is almost the same.
However the C++ version requires 305 MB, the GHC version 617 MB.


I wonder how much of that is due to lifting (i.e. laziness).

--
brandon s. allbery [solaris,freebsd,perl,pugs,haskell] allb...@kf8nh.com
system administrator [openafs,heimdal,too many hats] allb...@ece.cmu.edu
electrical and computer engineering, carnegie mellon universityKF8NH




PGP.sig
Description: This is a digitally signed message part
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] g++ std:map vs GHC IntMap

2009-03-26 Thread Bulat Ziganshin
Hello Manlio,

Thursday, March 26, 2009, 6:39:12 PM, you wrote:

 The test consists in adding 1000 elements to an empty map.

+RTS -c -F1.1

then read about garbage collection


-- 
Best regards,
 Bulat  mailto:bulat.zigans...@gmail.com

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Language.Haskell.Parser question

2009-03-26 Thread Vasyl Pasternak
Hi,

I want to parse haskell file to find all calls to function 'foo' and
gathers a create a list of all
argumets, which passed to it. E.g. from the following code:

f1 = foo 5
f2 = foo 8
f3 = foo 9

 I want to extract a list [5, 8, 9] (suppose the function takes only one argument)

The most obvious way is to use Language.Haskell for this task. The
parser works pretty good,
but its output data type is terrible. As I understand, I need to
extract all objects that looks like
HsApp (HsVar (UnQual (HsIdent "foo"))) 

The question is, is there a method to do it quickly or I have to
process each object of different type
separately ?

Maybe it is a work for a template Haskell ?

Thanks.

-- 
Best regards,
Vasyl Pasternak
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] g++ std:map vs GHC IntMap

2009-03-26 Thread Manlio Perillo

Brandon S. Allbery KF8NH ha scritto:

On 2009 Mar 26, at 11:39, Manlio Perillo wrote:

The execution time and CPU usage is almost the same.
However the C++ version requires 305 MB, the GHC version 617 MB.


I wonder how much of that is due to lifting (i.e. laziness).



http://hpaste.org/fastcgi/hpaste.fcgi/view?id=2911

Performance is the same, but I'm not sure if this removes all laziness.

I have changed
ins t (k,x)  = insert k x t
to
ins t (k,x)  = k `seq` x `seq` insert k x t


Manlio
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Language.Haskell.Parser question

2009-03-26 Thread Neil Mitchell
Hi

 f1 = foo 5
 f2 = foo 8
 f3 = foo 9

  I want to extract a list [5, 8, 9] (suppose the function takes only one 
 argument)

Firstly, use haskell-src-exts and Language.Haskell.Exts - its a much
better library, deals with many extensions, and gives you everything
Language.Haskell did.

 parser works pretty good,
 but its output data type is terrible. As I understand, I need to
 extract all objects that looks like
 HsApp (HsVar (UnQual (HsIdent "foo"))) 

It's trivial, if you use a generics solution, say  Uniplate:
http://community.haskell.org/~ndm/uniplate

[x | App foo x <- universeBi src, prettyPrint foo == "foo" ]

I often use prettyPrint to follow down Ident/Qual/UnQual paths without
having to know as much of the data type. Using universeBi makes it
easy to find everything that occurs everywhere.

If you attempt this without using a generics library, I'd say you are
destined to fail.
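
Putting the pieces together, a small end-to-end sketch might look like this
(module names follow haskell-src-exts and later uniplate releases, and may
need adjusting for your installed versions):

    import Language.Haskell.Exts            -- haskell-src-exts
    import Data.Generics.Uniplate.Data      -- uniplate, Data-based universeBi

    -- every argument expression passed to a function named "foo"
    fooArgs :: Module -> [Exp]
    fooArgs src = [ x | App f x <- universeBi src, prettyPrint f == "foo" ]

    main :: IO ()
    main = do
      ParseOk m <- parseFile "Input.hs"
      mapM_ (putStrLn . prettyPrint) (fooArgs m)
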

Thanks

Neil
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] g++ std:map vs GHC IntMap

2009-03-26 Thread Manlio Perillo

Bulat Ziganshin ha scritto:

Hello Manlio,

Thursday, March 26, 2009, 6:39:12 PM, you wrote:


The test consists in adding 1000 elements to an empty map.


+RTS -c -F1.1

then read about garbage collection




It now requires 386 MB of memory, but is 4.7 times slower.

So, now memory required is about the same as the C++ version, but how 
can I optimize memory usage without having to tweak the garbage collector?



Thanks  Manlio

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re[2]: [Haskell-cafe] g++ std:map vs GHC IntMap

2009-03-26 Thread Bulat Ziganshin
Hello Manlio,

Thursday, March 26, 2009, 8:17:03 PM, you wrote:

 So, now memory required is about the same as the C++ version, but how
 can I optimize memory usage without having to tweak the garbage collector?

C++ doesn't use GC, so why do you compare?


-- 
Best regards,
 Bulat  mailto:bulat.zigans...@gmail.com

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] g++ std:map vs GHC IntMap

2009-03-26 Thread Don Stewart
manlio_perillo:
 Bulat Ziganshin ha scritto:
 Hello Manlio,

 Thursday, March 26, 2009, 6:39:12 PM, you wrote:

 The test consists in adding 1000 elements to an empty map.

 +RTS -c -F1.1

 then read about garbage collection



 It now requires 386 MB of memory, but is 4.7 times slower.

 So, now memory required is about the same as the C++ version, but how  
 can I optimize memory usage without having to tweak the garbage 
 collector?

You'll need similar data structures.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re[2]: [Haskell-cafe] g++ std:map vs GHC IntMap

2009-03-26 Thread Bulat Ziganshin
Hello Don,

Thursday, March 26, 2009, 8:26:18 PM, you wrote:

 +RTS -c -F1.1

 It now requires 386 MB of memory, but is 4.7 times slower.

 So, now memory required is about the same as the C++ version, but how  
 can I optimize memory usage without having to tweak the garbage 
 collector?

 You'll need similar data structures.

can +RTS -A400m be considered a similar data structure? :)


-- 
Best regards,
 Bulat  mailto:bulat.zigans...@gmail.com

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Language.Haskell.Parser question

2009-03-26 Thread minh thu
2009/3/26 Vasyl Pasternak vasyl.paster...@gmail.com:
 Hi,

 I want to parse haskell file to find all calls to function 'foo' and
 gathers a create a list of all
 argumets, which passed to it. E.g. from the following code:

 f1 = foo 5
 f2 = foo 8
 f3 = foo 9

  I want to extract a list [5, 8, 9] (suppose the function takes only one 
 argument)

 The most obvious way is to use Language.Haskell for this task. The
 parser works pretty good,
 but its output data type is terrible. As I understand, I need to
 extract all objects that looks like
 HsApp (HsVar (UnQual (HsIdent "foo"))) 

 The question is, is there a method to do it quickly or I have to
 process each object of different type
 separately ?

Have a look at this:
http://neilmitchell.blogspot.com/2009/03/concise-generic-queries.html

Cheers,
Thu
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Re: Exception handling in numeric computations

2009-03-26 Thread John Lato
 Message: 15
 From: Xiao-Yong Jin xj2...@columbia.edu
 John Lato jwl...@gmail.com writes:

 So I have another question.  Is the following function safe
 and legitimate?

 safeDiv :: (Exception e, Integral a) =>
            a -> a -> Either e a
 safeDiv x y = unsafePerformIO . try . evaluate $ div x y

 I believe it should be okay to use this 'safeDiv'.  What do
 you think?

Your question is unclear.  Are you asking if this function can be
safely used with current Haskell standards and implementations, or are
you asking whether the Haskell specification should guarantee that this
function will do what you want?

I would answer no either way, but for different reasons.  Common to
both is: Don't use unsafe* functions unless you can prove that your
usage is safe.  That is proof as in mathematical usage; it must be
rigorous and complete.

John Lato
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Use unsafePerformIO to catch Exception?

2009-03-26 Thread Henning Thielemann


On Wed, 25 Mar 2009, wren ng thornton wrote:

Extensible exceptions are impressive, but the existence of exceptions 
outside of type annotations says something about purity.


Did I already promote the explicit-exceptions package? :-)
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Really need some help understanding a solution

2009-03-26 Thread Günther Schmidt

Hi guys,

I have tried for days now to figure out, by myself, a solution that Luke
Palmer has presented me with; I'm getting nowhere.


He has kindly provided me with this code:

import Data.Monoid

newtype IntTrie a = IntTrie [a]
deriving Show

singleton :: (Monoid a) = Int - a - IntTrie a
singleton ch x = IntTrie $ replicate ch mempty ++ [x] ++ repeat mempty

lookupTrie :: IntTrie a - Int - a
lookupTrie (IntTrie xs) n = xs !! n

instance (Monoid a) = Monoid (IntTrie a) where
mempty  = IntTrie (repeat mempty)
mappend (IntTrie xs) (IntTrie ys) = IntTrie (infZipWith mappend xs ys)

infZipWith f ~(x:xs) ~(y:ys) = f x y : infZipWith f xs ys

test = mconcat [singleton (n `mod` 42) [n] | n <- [0..]] `lookupTrie` 10

It's supposed to eventually help me group a list of key-value pairs and 
then further process them in a linear (streaming-like) way.


The original list being something like [('a', 23), ('b', 18), ('a', 34) 
...].


There are a couple of techniques employed in this solution, but I'm just 
guessing here.


The keywords I've been looking up so far:

Memoization, Deforestation, Single Pass, Linear Map and some others.

Can someone please fill me in?

Günther

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: Exception handling in numeric computations

2009-03-26 Thread Xiao-Yong Jin
Henning Thielemann lemm...@henning-thielemann.de writes:

 On Thu, 26 Mar 2009, Xiao-Yong Jin wrote:

 So I have another question.  Is the following function safe
 and legitimate?

 safeDiv :: (Exception e, Integral a) =>
            a -> a -> Either e a
 safeDiv x y = unsafePerformIO . try . evaluate $ div x y

 I believe it should be okay to use this 'safeDiv'.  What do

 I think that question is wrong way around. The real question is, why
 do you want to solve your problem using unsafePerformIO?

I just want to know, from a theoretical point of view,
whether this 'safeDiv' in above definition is the same as

 safeDiv' :: (Exception e, Integral a) =>
             a -> a -> Either e a
 safeDiv' _ 0 = Left e
 safeDiv' x y = Right $ div x y

For the question why do I want to do that, I am not sure.  I
guess if the function which has an error call inside is
provided by other library package, and I don't have a clear
and easy way to tell whether the function will make the
error call or not, it would be easy just to make a wrapper
like that.  It's also a possible situation that I don't know
how to test the input to a foreign function call.
-- 
c/*__o/*
\ * (__
*/\  
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: Exception handling in numeric computations

2009-03-26 Thread Jonathan Cast
On Thu, 2009-03-26 at 14:23 -0400, Xiao-Yong Jin wrote:
 Henning Thielemann lemm...@henning-thielemann.de writes:
 
  On Thu, 26 Mar 2009, Xiao-Yong Jin wrote:
 
  So I have another question.  Is the following function safe
  and legitimate?
 
  safeDiv :: (Exception e, Integral a) =>
             a -> a -> Either e a
  safeDiv x y = unsafePerformIO . try . evaluate $ div x y
 
  I believe it should be okay to use this 'safeDiv'.  What do
 
  I think that question is wrong way around. The real question is, why
  do you want to solve your problem using unsafePerformIO?
 
 I just want to know, from a theoretical point of view,
 whether this 'safeDiv' in above definition is the same as
 
  safeDiv' :: (Exception e, Integral a) =>
              a -> a -> Either e a
  safeDiv' _ 0 = Left e
  safeDiv' x y = Right $ div x y

You need some sort of type case here to make sure your first case
matches only if e is the right type for divide-by-zero errors (too lazy
to look it up atm).  Alternatively, you could replace your type variable
e with the actual exception type you want, here and in the
unsafePerformIO version.

Other than that, I think the imprecise exceptions paper guarantees that
these two functions are equivalent (albeit unwisely: see below).
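
Concretely, pinning down the type might look like this (a sketch; still not
something I'd recommend, for the reasons below):

    import Control.Exception (ArithException, evaluate, try)
    import System.IO.Unsafe (unsafePerformIO)

    -- only arithmetic exceptions (e.g. divide by zero) are caught here
    safeDivArith :: Integral a => a -> a -> Either ArithException a
    safeDivArith x y = unsafePerformIO . try . evaluate $ div x y
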

 For the question why do I want to do that, I am not sure.  I
 guess if the function which has an error call inside is
 provided by other library package, and I don't have a clear
 and easy way to tell whether the function will make the
 error call or not, it would be easy just to make a wrapper
 like that.

It might be easy, but if you didn't have a lot of insight into the
function's behavior, then it would be difficult to tell whether it's
really going to call error or whether it's going to go off into an
infinite loop.  (Consider the (slow) definition

x ^ n | n == 0    = 1
      | n < 0     = error "Negative exponents require ^^"
      | otherwise = x * x ^ (n - 1)

Now consider what happens if the library function forgets the second
case.  Your wrapper isn't safe anymore!)

I can see only two cases where a library function could call error
sometimes, and you wouldn't have a good feel for when:

 a) The function is calling error on exceptions.  You should bug the
library author to put the function into an exception monad instead.
Devil-may-care users can use

either (error . show) id

to turn exceptions into errors.

 b) The function has explicit pre-conditions, which you don't
understand.  You shouldn't pass arguments to a function that violate its
pre-conditions (ever!); if you don't understand those preconditions well
enough to test them in Haskell code, you might not understand them well
enough to make sure your code is calling the function correctly.  So you
might want to study the preconditions a little more.

 It's also a possible situation that I don't know
 how to test the input to a foreign function call.

FFI calls cannot throw Haskell exceptions.

jcc


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: Exception handling in numeric computations

2009-03-26 Thread Claus Reinke

safeDiv :: (Exception e, Integral a) =>
           a -> a -> Either e a
safeDiv x y = unsafePerformIO . try . evaluate $ div x y


I just want to know, from a theoretical point of view,
whether this 'safeDiv' in above definition is the same as


safeDiv' :: (Exception e, Integral a) =>
            a -> a -> Either e a
safeDiv' _ 0 = Left e
safeDiv' x y = Right $ div x y


No. Firstly, safeDiv' doesn't compile!-) Then, if you replace
'e' by 'DivideByZero' and adjust the types:

   *Main> safeDiv 1 (throw Overflow)
   Left arithmetic overflow
   *Main> safeDiv' 1 (throw Overflow)
   *** Exception: arithmetic overflow

Try ':info ArithException' for more in the same group. You
could use other functions in Control.Exceptions to get more
control about which exceptions you want to handle and how,
but so far, there is no indication that 'unsafePerformIO' is
the right hammer to use here..

Claus

-- unsagePerformIO: some things are just not wise to do

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


AW: Exception handling in numeric computations (was Re: [Haskell-cafe] Use unsafePerformIO to catch Exception?)

2009-03-26 Thread Kemps-Benedix Torsten
All this is certainly true. But there is still a valid concern: Numerical tasks 
often allow one to predict whether an error might occur or not. Let's say you know 
that you have a normal matrix N and want to calculate (1+N*N)^-1. Of course the 
matrix is invertible and therefore it is reasonable to provide two distinct 
implementations:

invMat :: Mat - Maybe Mat
invMat = ...

-- |Like 'invMat' but gives an error in case the matrix is not invertible.
invMat' :: Mat - Mat
invMat' = fromJust . invMat

Kind regards
 
Torsten


-----Original Message-----
From: haskell-cafe-boun...@haskell.org [mailto:haskell-cafe-boun...@haskell.org] 
On behalf of Henning Thielemann
Sent: Tuesday, 24 March 2009, 21:34
To: Xiao-Yong Jin
Cc: haskell-cafe@haskell.org
Subject: Re: Exception handling in numeric computations (was Re: 
[Haskell-cafe] Use unsafePerformIO to catch Exception?)


On Tue, 24 Mar 2009, Xiao-Yong Jin wrote:

 invMat :: Matrix - Matrix

 You won't be able to invert all matrices, mathematically.
 And computationally, an even larger set of matrices might fail
 to be inverted because of the finite precision.  It is
 relatively easier and more efficient to spot such a problem
 within this 'invMat' function.  Because testing the
 singularity of a matrix is equally hard as invert it.  So
 all I can do when 'invMat' spot a singular matrix are

  a) Return Either/Maybe to signal an error.

This is the way to go.

  b) Wrap it in a monad.

Either and Maybe are monads. These monads behave like exceptions in other 
languages. I like to call these exceptions.

  c) Define a dynamic exception and throw it.

You cannot throw an exception in code that does not return Maybe, Either, 
IO or such things. You can only abuse 'undefined' and turn it into a 
defined value later by a hack. Think of 'undefined' as an infinite loop, 
that cannot detected in general. GHC is kind enough to detect special 
cases, in order to simplify debugging. But this should not be abused for 
exceptional return values.

 The problem is that there will be many functions using such
 a function to invert a matrix, making this inversion
 function return Either/Maybe or packing it in a monad is
 just a big headache.  It is impractical to use method (a),
 because not every function that uses 'invMat' knows how to
 deal with 'invMat' not giving an answer.

How shall it deal with 'undefined' then? 'undefined' can only be handled 
by a hack, so Maybe or Either are clearly better.

 invMat :: Matrix - NumericCancerMonad Matrix

 It hides the exceptional nature of numerical computations
 very well, but it is cancer in the code.  Whenever any
 function wants to use invMat, it is mutated.  This is just
 madness.

No it makes explicit what's going on. This is the idea of functional 
programming. You have nice Applicative infix operators in order to write 
everything in a functional look anyway. In contrast, I think it is mad 
that there is no function of type
   mulInt :: Int -> Int -> Maybe Int
  which allows us to catch overflows without using hacks. This function 
could be easily integrated into a compiler, since the CPU indicates by a flag 
that an overflow occurred. In most high-level languages this is not 
possible, and thus programming in an overflow-proof way needs additional 
computations.
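
Lacking compiler support, a plain-Haskell sketch of such a function has to pay
for the check itself, e.g. by comparing against the exact Integer result (my
own illustration):

    mulInt :: Int -> Int -> Maybe Int
    mulInt x y
      | toInteger x * toInteger y == toInteger z = Just z   -- no overflow
      | otherwise                                = Nothing  -- wrapped around
      where z = x * y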

  You don't want to make all the code monadic just because of 
 singularities in numeric calculation. Therefore, in my opinion, method 
 (c) is my only option.

Then better stick to MatLab. :-(
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: mapM as a Space Leak (Was: [Haskell-cafe] about Haskell code written to be too smart)

2009-03-26 Thread Thomas Hartman
 I wonder if JHC
 or some other compiler might work better with these examples?

Are you saying that different compilers might give different answers?

Yikes!

Too clever indeed!

2009/3/26  rocon...@theorem.ca:
 On Wed, 25 Mar 2009, Thomas Hartman wrote:

 With the state version, there's a lot of behind-the-scenes magic, and
 as we've seen, things can go wrong.

 Also, the issue isn't infinite lists, but lists that are longer than
 the sum of the partitions provided. The state monad partition version
 goes equally as badly awry if the test is restructured as

 testP pf = mapM_ putStrLn  [
         show . pf ( take 1000 [3,7..] ) $ [1..10]
         , show . pf [3,7,11,15] $ ( take (10^6) [1..])
         , show . head . last $ pf (take 1000 $ [3,3..]) [1..10^6]
       ]

 This is interesting.  It seems to be the familiar issue that sequence does
 not play as nicely with the GC as one might imagine:
 http://www.reddit.com/r/haskell/comments/7itbi/mapm_mapm_and_monadic_statements/c06rwnb?context=1

 I suspect this may be a general problem that we will keep encountering when
 using higher-order functions, at least with this compiler.  I wonder if JHC
 or some other compiler might work better with these examples?

 --
 Russell O'Connor                                      http://r6.ca/
 ``All talk about `theft,''' the general counsel of the American Graphophone
 Company wrote, ``is the merest claptrap, for there exists no property in
 ideas musical, literary or artistic, except as defined by statute.''
 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


mapM as a Space Leak (Was: [Haskell-cafe] about Haskell code written to be too smart)

2009-03-26 Thread roconnor

On Wed, 25 Mar 2009, Thomas Hartman wrote:


With the state version, there's a lot of behind-the-scenes magic, and
as we've seen, things can go wrong.

Also, the issue isn't infinite lists, but lists that are longer than
the sum of the partitions provided. The state monad partition version
goes equally as badly awry if the test is restructured as

testP pf = mapM_ putStrLn  [
 show . pf ( take 1000 [3,7..] ) $ [1..10]
 , show . pf [3,7,11,15] $ ( take (10^6) [1..])
 , show . head . last $ pf (take 1000 $ [3,3..]) [1..10^6]
   ]


This is interesting.  It seems to be the familiar issue that sequence does 
not play as nicely with the GC as one might imagine:

http://www.reddit.com/r/haskell/comments/7itbi/mapm_mapm_and_monadic_statements/c06rwnb?context=1

I suspect this may be a general problem that we will keep encountering 
when using higher-order functions, at least with this compiler.  I wonder 
if JHC or some other compiler might work better with these examples?


--
Russell O'Connor  http://r6.ca/
``All talk about `theft,''' the general counsel of the American Graphophone
Company wrote, ``is the merest claptrap, for there exists no property in
ideas musical, literary or artistic, except as defined by statute.''
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: mapM as a Space Leak (Was: [Haskell-cafe] about Haskell code written to be too smart)

2009-03-26 Thread Jonathan Cast
On Thu, 2009-03-26 at 12:29 -0700, Thomas Hartman wrote:
  I wonder if JHC
  or some other compiler might work better with these examples?
 
 Are you saying that different compilers might give different answers?
 
 Yikes!
 
 Too clever indeed!

No, they might produce code with different performance characteristics.

Which is very much what you want; there is no way to compile Haskell
such that reasonable-looking code is

 a) Fast and
 b) Predictably performant.

The idea of Haskell is to abstract away from the predictable performance
of the code by a) using a good compiler, and b) putting absolute
un-questioning faith in your profiler.

jcc


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: mapM as a Space Leak (Was: [Haskell-cafe] about Haskell code written to be too smart)

2009-03-26 Thread Thomas Hartman
Well, that's reassuring.

The reason I asked is that the testp function didn't just show poor
performance. The state monad implementation actually gave a different
answer -- nonterminating, where the pattern matching solution
terminated.

2009/3/26 Jonathan Cast jonathancc...@fastmail.fm:
 On Thu, 2009-03-26 at 12:29 -0700, Thomas Hartman wrote:
  I wonder if JHC
  or some other compiler might work better with these examples?

 Are you saying that different compilers might give different answers?

 Yikes!

 Too clever indeed!

 No, they might produce code with different performance characteristics.

 Which is very much what you want; there is no way to compile Haskell
 such that reasonable-looking code is

  a) Fast and
  b) Predictably performant.

 The idea of Haskell is to abstract away from the predictable performance
 of the code by a) using a good compiler, and b) putting absolute
 un-questioning faith in your profiler.

 jcc



___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] ANNOUNCE: wxAsteroids 1.0

2009-03-26 Thread Henk-Jan van Tuyl


Your space ship enters an asteroid belt, try to avoid
collisions!

wxAsteroids is a game demonstrating the wxHaskell GUI.

More about this at:
  http://www.haskell.org/haskellwiki/wxAsteroids

--
Met vriendelijke groet,
Henk-Jan van Tuyl


--
http://functor.bamikanarie.com
http://Van.Tuyl.eu/
--


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Language.Haskell.Parser question

2009-03-26 Thread John Van Enk
Actually, looking at the docs for UniplateStr[1], isn't there an error in
the following example statement in the Queries section?

vals x = [Val i | i <- universe x]

Shouldn't that be:

vals x = [i | Val i <- universe x]

?

/jve

1.
http://hackage.haskell.org/packages/archive/uniplate/1.2.0.3/doc/html/Data-Generics-UniplateStr.html


On Thu, Mar 26, 2009 at 1:47 PM, minh thu not...@gmail.com wrote:

 2009/3/26 Vasyl Pasternak vasyl.paster...@gmail.com:
  Hi,
 
  I want to parse haskell file to find all calls to function 'foo' and
  gathers a create a list of all
  argumets, which passed to it. E.g. from the following code:
 
  f1 = foo 5
  f2 = foo 8
  f3 = foo 9
 
   I want to extract a list [5, 8, 9] (suppose the function takes only one
  argument)
 
  The most obvious way is to use Language.Haskell for this task. The
  parser works pretty good,
  but its output data type is terrible. As I understand, I need to
  extract all objects that looks like
  HsApp (HsVar (UnQual (HsIdent "foo"))) 
 
  The question is, is there a method to do it quickly or I have to
  process each object of different type
  separately ?

 Have a look at this:
 http://neilmitchell.blogspot.com/2009/03/concise-generic-queries.html

 Cheers,
 Thu
 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] ANN: io-capture-0.2 capturing std(out|err) in IO action

2009-03-26 Thread Yusaku Hashimoto

Hi, I'm announcing the release of the io-capture package, version 0.2.

io-capture is a library for capturing stdout and stderr in an IO
action. It exports a function named `capture'; it takes an IO action
and a String (the action to run and the whole stdin to feed it), and
returns the stdout and stderr produced by the action as Strings.

For example:

ghci> :m +System.IO.Capture
ghci> :m +System.IO
ghci> capture (getLine >>= putStr >> getLine >>= hPutStr stderr) "foo\nbar\n"
("foo","bar")

io-capture is available on hackage, repository is on patch-tag.

  http://hackage.haskell.org/cgi-bin/hackage-scripts/package/io-capture-0.2

  http://patch-tag.com/repo/io-capture/browse

It behaves almost fine, but it's still experimental; I know of some
trouble when it is given a lazy reading action such as getContents. And
there may be unknown bugs. If you find one, feel free to report it;
any feedback is welcome.

Thanks for reading,

Yusaku Hashimoto

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Language.Haskell.Parser question

2009-03-26 Thread John Van Enk
Excellent!
The black magic had me scratching my head until I realized it was broken
magic. :)

/jve


On Thu, Mar 26, 2009 at 4:23 PM, Neil Mitchell ndmitch...@gmail.com wrote:

 Hi John,

  Actually, looking at the docs for UniplateStr[1], isn't there an error in
  the following example statement in the Queries section?
 
  vals x = [Val i | i <- universe x]
 
  Shouldn't that be:
  vals x = [i | Val i <- universe x]

 Yep, you are indeed right. I've fixed the examples in the darcs
 version, and next time there is a release these changes will be
 incorporated.

 Many thanks

 Neil

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Language.Haskell.Parser question

2009-03-26 Thread Neil Mitchell
Hi John,

 Actually, looking at the docs for UniplateStr[1], isn't there an error in
 the following example statement in the Queries section?

 vals x = [Val i | i <- universe x]

 Shouldn't that be:
 vals x = [i | Val i <- universe x]

Yep, you are indeed right. I've fixed the examples in the darcs
version, and next time there is a release these changes will be
incorporated.

Many thanks

Neil
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Compile OpenGL for jhc

2009-03-26 Thread Csaba Hruska
Hi!

As the OpenGL Haskell binding depends only on base, it would be great to compile
it with jhc. I've tried to do that, but I had some problems with
preprocessor directives.
It would also be great to add jhc support to the OpenGL binding's build scripts.

Or, it would be even better to add jhc support to cabal-install. :)

Does anyone have an idea about any of these things (towards a solution)?
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Language.Haskell.Parser question

2009-03-26 Thread Neil Mitchell
 Excellent!
 The black magic had me scratching my head until I realized it was broken
 magic. :)

I should probably rerelease Uniplate so the documentation gets fixed,
but it's not actually broken - although there is no way it does what
I intended!

vals x = [Val i | i <- universe x]

Given the type rules:

Val :: Int -> Expr
universe :: a -> [a]

And the knowledge that:

universe (i :: Int) = [i]

We can conclude that i :: Int and x :: Int. Therefore the function is
equivalent to:

vals :: Int -> [Expr]
vals i = [Val i]

Certainly not what I was going for, but it does work! You could even
argue it would be sensible to name such a function vals...
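
For reference, here is a self-contained sketch of the corrected query, assuming
the Data-based Uniplate module; the Expr type is filled out minimally to match
the Val constructor above, so the surrounding names are illustrative only:

{-# LANGUAGE DeriveDataTypeable #-}
import Data.Data (Data, Typeable)
import Data.Generics.Uniplate.Data (universe)

-- a toy expression type; Uniplate works through the derived Data instance
data Expr = Val Int | Add Expr Expr
  deriving (Show, Data, Typeable)

-- the corrected query: collect every literal occurring inside an expression
vals :: Expr -> [Int]
vals x = [i | Val i <- universe x]

main :: IO ()
main = print (vals (Add (Val 1) (Add (Val 2) (Val 3))))  -- prints [1,2,3]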

Thanks

Neil


 On Thu, Mar 26, 2009 at 4:23 PM, Neil Mitchell ndmitch...@gmail.com wrote:

 Hi John,

  Actually, looking at the docs for UniplateStr[1], isn't there an error
  in
  the following example statement in the Queries section?
 
  vals x = [Val i | i <- universe x]
 
  Shouldn't that be:
  vals x = [i | Val i <- universe x]

 Yep, you are indeed right. I've fixed the examples in the darcs
 version, and next time there is a release these changes will be
 incorporated.

 Many thanks

 Neil


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Really need some help understanding a solution

2009-03-26 Thread Luke Palmer
On Thu, Mar 26, 2009 at 12:21 PM, Günther Schmidt gue.schm...@web.de wrote:

 Hi guys,

 I tried for days now to figure out a solution that Luke Palmer has
 presented me with, by myself, I'm getting nowhere.


Sorry, I meant to respond earlier.

They say you don't really understand something until you can explain it to a
six year old.  So trying to explain this to a colleague made me realize how
little I must understand it :-).  But I'll try by saying whatever comes to
mind...

*Lazy* list processing is all about *right* associativity.  We need to be
able to output some information knowing that our input looks like a:b:c:...,
where we don't know the ...  I see IntTrie [a] as an infinite collection of
lists (well, it is [[a]], after all :-), one for each integer.  So I want to
take a structure like this:

(1,2):(3,4):(3,5):...

And turn it into a structure like this:

{
0 -> ...
1 -> 2:...
2 -> ...
3 -> 4:5:...
...
}

(This is just a list in my implementation, but I intended it to be a trie,
ideally, which is why I wrote the keys explicitly)

So the yet-unknown information at the tail of the list turns into
yet-unknown information about the tails of the keys.  In fact, if you
replace ... with _|_, you get exactly the same thing (this is no
coincidence!)

The spine of this trie is maximally lazy: this is key.  If the structure of
the spine depended on the input data (as it does for Data.Map), then we
wouldn't be able to process infinite data, because we can never get it all.
So even making a trie out of the list _|_ gives us:

{ 0 -> _|_, 1 -> _|_, 2 -> _|_, ... }

I.e. the keys are still there.  Then we can combine two tries just by
combining them pointwise (which is what infZipWith does).  It is essential
that the pattern matches on infZipWith are lazy. We're zipping together an
infinite sequence of lists, and normally the result would be the length of
the shortest one, which is unknowable.  So the lazy pattern match forces the
result ('s spine) to be infinite.

Umm... yeah, that's a braindump.   Sorry I couldn't be more helpful.  I'm
happy to answer any specific questions.

Luke



 He has kindly provided me with this code:

 import Data.Monoid

 newtype IntTrie a = IntTrie [a]
deriving Show

 singleton :: (Monoid a) => Int -> a -> IntTrie a
 singleton ch x = IntTrie $ replicate ch mempty ++ [x] ++ repeat mempty

 lookupTrie :: IntTrie a -> Int -> a
 lookupTrie (IntTrie xs) n = xs !! n

 instance (Monoid a) => Monoid (IntTrie a) where
     mempty                            = IntTrie (repeat mempty)
     mappend (IntTrie xs) (IntTrie ys) = IntTrie (infZipWith mappend xs ys)

 infZipWith f ~(x:xs) ~(y:ys) = f x y : infZipWith f xs ys

 test =  mconcat [singleton (n `mod` 42) [n] | n <- [0..]] `lookupTrie` 10

 It's supposed to eventually help me group a list of key value pairs and
 then further process them in a linear (streaming like) way.

 The original list being something like [('a', 23), ('b', 18), ('a', 34)
 ...].

 There are couple of techniques employed in this solution, but I'm just
 guessing here.

 The keywords I've been looking up so far:

 Memoization, Deforestation, Single Pass, Linear Map and some others.

 Can someone please fill me in?

 Günther

 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Compile OpenGL for jhc

2009-03-26 Thread Thomas DuBuisson
I don't know about OpenGL specifically, but there is the --jhc flag
for cabal-install, ex:

cabal --jhc Crypto

Thomas

2009/3/26 Csaba Hruska csaba.hru...@gmail.com:
 Hi!

 As OpenGL haskell binding depends only on base, it would be great to compile
 it with jhc. I've tried to do that, but I had some problems with
 preprocessor directives.
 It would be also great to add jhc support for OpenGL bindings build scripts.
 Or, it would be better to add jhc support for cabal-install. :)

 Does anyone have any ideas on any of these things? (towards a solution)



 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Compile OpenGL for jhc

2009-03-26 Thread Thomas DuBuisson
Sorry for the double but I messed up the example.  That should have been:

cabal --jhc install Crypto



On Thu, Mar 26, 2009 at 2:10 PM, Thomas DuBuisson
thomas.dubuis...@gmail.com wrote:
 I don't know about OpenGL specifically, but there is the --jhc flag
 for cabal-install, ex:

    cabal --jhc Crypto

 Thomas

 2009/3/26 Csaba Hruska csaba.hru...@gmail.com:
 Hi!

 As OpenGL haskell binding depends only on base, it would be great to compile
 it with jhc. I've tried to do that, but I had some problems with
 preprocessor directives.
 It would be also great to add jhc support for OpenGL bindings build scripts.
 Or, it would be better to add jhc support for cabal-install. :)

 Does anyone have any ideas on any of these things? (towards a solution)



 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe



___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Compile OpenGL for jhc

2009-03-26 Thread Csaba Hruska
Here is the error message of: cabal --jhc install OpenGL

http://pastebin.com/m541052a3

2009/3/26 Thomas DuBuisson thomas.dubuis...@gmail.com

 Sorry for the double but I messed up the example.  That should have been:

cabal --jhc install Crypto



 On Thu, Mar 26, 2009 at 2:10 PM, Thomas DuBuisson
 thomas.dubuis...@gmail.com wrote:
  I don't know about OpenGL specifically, but there is the --jhc flag
  for cabal-install, ex:
 
 cabal --jhc Crypto
 
  Thomas
 
  2009/3/26 Csaba Hruska csaba.hru...@gmail.com:
  Hi!
 
  As OpenGL haskell binding depends only on base, it would be great to
 compile
  it with jhc. I've tried to do that, but I had some problems with
  preprocessor directives.
  It would be also great to add jhc support for OpenGL bindings build
 scripts.
  Or, it would be better to add jhc support for cabal-install. :)
 
  Does anyone have any ideas on any of these things? (towards a solution)
 
 
 
  ___
  Haskell-Cafe mailing list
  Haskell-Cafe@haskell.org
  http://www.haskell.org/mailman/listinfo/haskell-cafe
 
 
 

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Re: Grouping - Map / Reduce

2009-03-26 Thread Heinrich Apfelmus
Luke Palmer wrote:
 On Tue, Mar 24, 2009 at 3:15 PM, Günther Schmidt gue.schm...@web.de wrote:
 
 Hi,

 let say I got an unordered lazy list of key/value pairs like

 [('a', 99), ('x', 42), ('a', 33) ... ]

 and I need to sum up all the values with the same keys.

 So far I wrote a naive implementation, using Data.Map, foldl and
 insertWith.

 The result of this grouping operation, which is effectively another list
 of key/value pairs, just sums this time, needs to be further processed.

 The building of this map is of course a bottleneck; the successive
 processing needs to wait until the entire list is eventually consumed,
 the Map is built, and flattened again.

 Is there another way of doing this, something more like a streaming
 architecture?
 
 
 Yeah, make a trie.  Here's a quick example.

Nice!

There was a thread about this question a few years ago, with some very
interesting developments like the blueprint technique by Bertram
Felgenhauer.

  http://thread.gmane.org/gmane.comp.lang.haskell.cafe/15135


Regards,
apfelmus

--
http://apfelmus.nfshost.com

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Re: Grouping - Map / Reduce - blueprint

2009-03-26 Thread Henning Thielemann


On Thu, 26 Mar 2009, Heinrich Apfelmus wrote:


Luke Palmer wrote:


Yeah, make a trie.  Here's a quick example.


Nice!

There was a thread about this question a few years ago, with some very
interesting developments like the blueprint technique by Bertram
Felgenhauer.

 http://thread.gmane.org/gmane.comp.lang.haskell.cafe/15135


I remember that this thread yielded a Wiki page, which seems to be gone 
together with Hawiki. I have at least started a page on the new Wiki in 
order to preserve the link to that discussion:

  http://haskell.org/haskellwiki/Blueprint
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] an OS-independent executeFile??

2009-03-26 Thread Vasili I. Galchin
Hello,

  I have been looking, through Hackage and using Hoogle, for a way to fork and
execute a program in an OS-independent way, i.e. neutral with respect to the
POSIX and Win32 APIs. Does such a library function exist?

Regards, Vasili
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] an OS-independent executeFile??

2009-03-26 Thread Jonathan Cast
On Thu, 2009-03-26 at 17:16 -0500, Vasili I. Galchin wrote:
 Hello,
 
   I have been looking through Hackage and using Hoogle to fork
 and execute a program in an OS-independent way, i.e. neutral from
 POSIX and Win32 APIs. Does such a library function exist?

System.Process.createProcess
( 
http://haskell.org/ghc/docs/latest/html/libraries/process/System-Process.html#v%3AcreateProcess
 ) works on both Unix and Windows.[1]

jcc

[1] This is not the same as OS-independent!
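
For what it's worth, a minimal sketch of using that API (the command and its
arguments here are purely illustrative, and error handling is omitted):

import System.Process (createProcess, proc, waitForProcess)

main :: IO ()
main = do
  -- start an external program with arguments, then wait for it to finish
  (_, _, _, ph) <- createProcess (proc "ls" ["-l"])
  exitCode <- waitForProcess ph
  print exitCode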


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] an OS-independent executeFile??

2009-03-26 Thread Vasili I. Galchin
ok .. how about API independent? ;^)

On Thu, Mar 26, 2009 at 5:14 PM, Jonathan Cast jonathancc...@fastmail.fmwrote:

 On Thu, 2009-03-26 at 17:16 -0500, Vasili I. Galchin wrote:
  Hello,
 
I have been looking through Hackage and using Hoogle to fork
  and execute a program in an OS-independent way, i.e. neutral from
  POSIX and Win32 APIs. Does such a library function exist?

 System.Process.createProcess
 (
 http://haskell.org/ghc/docs/latest/html/libraries/process/System-Process.html#v%3AcreateProcess)
  works on both Unix and Windows.[1]

 jcc

 [1] This is not the same as OS-independent!



___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Language.Haskell.Parser question

2009-03-26 Thread Vasyl Pasternak
Wow, uniplate is the library from my dreams :)

I knew there should be an easy, elegant and simple solution (I hate brute
force; that's why I started using Haskell).

Many thanks.

2009/3/26 Neil Mitchell ndmitch...@gmail.com:
 Hi

 f1 = foo 5
 f2 = foo 8
 f3 = foo 9

  I want to extract a list [5, 8, 9] (suppose the function takes only one
 argument)

 Firstly, use haskell-src-exts and Language.Haskell.Exts - it's a much
 better library, deals with many extensions, and gives you everything
 Language.Haskell did.

 parser works pretty well,
 but its output data type is terrible. As I understand it, I need to
 extract all objects that look like
 HsApp (HsVar (UnQual (HsIdent "foo")))

 It's trivial, if you use a generics solution, say  Uniplate:
 http://community.haskell.org/~ndm/uniplate

 [x | App foo x <- universeBi src, prettyPrint foo == "foo" ]

 I often use prettyPrint to follow down Ident/Qual/UnQual paths without
 having to know as much of the data type. Using universeBi makes it
 easy to find everything that occurs everywhere.

 If you attempt this without using a generics library, I'd say you are
 destined to fail.

 Thanks

 Neil




-- 
Best regards,
Vasyl Pasternak
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] an OS-independent executeFile??

2009-03-26 Thread Jonathan Cast
On Thu, 2009-03-26 at 17:27 -0500, Vasili I. Galchin wrote:
 ok .. how about API independent? ;^)

Last I checked VMS, OS/360 (NB: not dead by a long shot), etc. had APIs
too.

What you really mean is `does not break when run against Windows's
pseudo-POSIX API despite Microsoft's best efforts' :)

jcc


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] an OS-independent executeFile??

2009-03-26 Thread Vasili I. Galchin
;^)

On Thu, Mar 26, 2009 at 5:32 PM, Jonathan Cast jonathancc...@fastmail.fmwrote:

 On Thu, 2009-03-26 at 17:27 -0500, Vasili I. Galchin wrote:
  ok .. how about API independent? ;^)

 Last I checked VMS, OS/360 (NB: not dead by a long shot), etc. had APIs
 too.

 What you really mean is `does not break when run against Windows's
 pseudo-POSIX API despite Microsoft's best efforts' :)

 jcc



___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Re: Really need some help understanding a solution

2009-03-26 Thread Günther Schmidt

Dear Luke,

let me thank you first of all for your response, period, and let me also 
assure you that your efforts are not (totally) in vain. :)



Of course I had all kinds of guesses what sort of dark arts you employed 
here, mostly really long shots (Memoization, CPS ...). And at the end of 
the day it turns out to be a lot simpler than that.


You've basically chosen a data structure that can be consumed while it's 
still being built (due to Lazy Evaluation). I certainly haven't figured 
it all out and it will take a lot more time to play with Tries before I 
have.


I'm already eager to sift through the other goodies that are on your 
blog once I have finished my app, very interesting stuff even if can't 
understand half of it. :)


Would you happen to know a good starting point about tries?

Günther

Luke Palmer schrieb:
On Thu, Mar 26, 2009 at 12:21 PM, Günther Schmidt gue.schm...@web.de wrote:


Hi guys,

I tried for days now to figure out a solution that Luke Palmer has
presented me with, by myself, I'm getting nowhere.


Sorry, I meant to respond earlier.

They say you don't really understand something until you can explain it 
to a six year old.  So trying to explain this to a colleague made me 
realize how little I must understand it :-)..  But I'll try by saying 
whatever come to mind...


/Lazy/ list processing is all about /right/ associativity.  We need to 
be able to output some information knowing that our input looks like 
a:b:c:..., where we don't know the ...  I see IntTrie [a] as an infinite 
collection of lists (well, it is [[a]], after all :-), one for each 
integer.  So I want to take a structure like this:


(1,2):(3,4):(3,5):...

And turn it into a structure like this:

{
0 -> ...
1 -> 2:...
2 -> ...
3 -> 4:5:...
...
}

(This is just a list in my implementation, but I intended it to be a 
trie, ideally, which is why I wrote the keys explicitly)


So the yet-unknown information at the tail of the list turns into 
yet-unknown information about the tails of the keys.  In fact, if you 
replace ... with _|_, you get exactly the same thing (this is no 
coincidence!)


The spine of this trie is maximally lazy: this is key.  If the structure 
of the spine depended on the input data (as it does for Data.Map), then 
we wouldn't be able to process infinite data, because we can never get 
it all.  So even making a trie out of the list _|_ gives us:


{ 0 -> _|_, 1 -> _|_, 2 -> _|_, ... }

I.e. the keys are still there.  Then we can combine two tries just by 
combining them pointwise (which is what infZipWith does).  It is 
essential that the pattern matches on infZipWith are lazy. We're zipping 
together an infinite sequence of lists, and normally the result would be 
the length of the shortest one, which is unknowable.  So the lazy 
pattern match forces the result ('s spine) to be infinite.


Umm... yeah, that's a braindump.   Sorry I couldn't be more helpful.  
I'm happy to answer any specific questions.


Luke



___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Grouping - Map / Reduce

2009-03-26 Thread Peter Verswyvelen
I'm also learning Haskell so the solution below might be (1) inefficient and
(2) incorrect, but hey, let's give it a try :-)
For simplicity, in the testing code, I assume an infinite list of key/value
pairs where they keys are of type Char between 'a' and 'z' and the values
are Integers (the code also seems to work for keys with just a lower bound
but no upper bound)

I think the code speaks for itself

 import System.Random

runningSumsOfValuesPerKey :: (Eq k, Num v) => [k] -> [(k, v)] -> [[v]]
runningSumsOfValuesPerKey allPossibleKeys = runningSums . allValuesPerKey
  where
    runningSums = map (scanl (+) 0)
    allValuesPerKey pairs = [ valuesWithKey key pairs | key <- allPossibleKeys ]
    valuesWithKey key = map snd . filter ((==key) . fst)

-- Testing
randomPairs :: [(Char, Integer)]
randomPairs = zip keys values
  where
    keys        = randomRs ('a','z') rnd1
    values      = randomRs (0,9) rnd2
    (rnd1,rnd2) = split (mkStdGen 0)

test = map (take 10) [rs `atKey` 'c', rs `atKey` 'z']
  where
    rs = runningSumsOfValuesPerKey ['a'..] randomPairs
    xs `atKey` k = xs !! (fromEnum k - fromEnum 'a')
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] about Haskell code written to be too smart

2009-03-26 Thread wren ng thornton

Colin Adams wrote:

2009/3/25 wren ng thornton w...@freegeek.org:
   Most of the documentation is in research papers, and a normal
   programmer don't want to read these papers.

 Yes, and no. There is quite a bit of documentation in research papers, and
 mainstream programmers don't read research. However, this is a big part of
 what makes the Haskell community what it is. There are plenty of
 non-academics here, but they have the willingness to read these papers (even
 if it's out of the ordinary) and the desire to learn radical new things
 (because they're out of the ordinary).

Yes.
BUT ...

when I look up the Haddock-generated documentation for a function, I
DON'T appreciate it if that is in the form of a hyperlink to a
research paper.
And that occurs in several of the libraries shipped with GHC for instance.

A reference to a research paper is fine to show where the ideas came
from, but that is not where the library documentation should be.



Yeah, that's bad. 'Documentation' like that should be corrected with 
Extreme Prejudice.


--
Live well,
~wren
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Really need some help understanding a solution

2009-03-26 Thread Thomas Hartman
Luke, does your explanation to Guenther have anything to do with
coinduction? -- the property that a producer gives a little bit of
output at each step of recursion, which a consumer can then crunch in
a lazy way?

I find that coinduction seems to figure frequently in algos that
process a stream.

So, Guenther, I'm not certain if coinduction figures in here yet but
it gives you another thing to google on. Co-induction seems to play
the same role for stream processing in haskell that tail recursiveness
plays in non-lazy languages
like lisp. That is, it's kind of the ideal to be striven for. Whereas
in haskell, tail recursion is frequently not the best thing because it
goes into a non-terminating state when there is an infinite data
structure which is crunched down to a finite one but at the wrong
point in the function pipeline.

see 
http://groups.google.com/group/fa.haskell/browse_thread/thread/4240bc7c7abd4d30/49f28f5a41519335?q=it+is+however+nicely+coinductive#49f28f5a41519335
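
As a tiny sketch of that productive style (illustrative names, nothing from
the thread): each recursive step emits one constructor, so a lazy consumer can
take as much of the infinite stream as it needs.

-- a guarded (productive) corecursive definition: one cons cell per step
nats :: [Integer]
nats = 0 : map (+ 1) nats

-- the consumer only forces as much of the stream as it asks for
firstTen :: [Integer]
firstTen = take 10 nats   -- [0,1,2,3,4,5,6,7,8,9]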


2009/3/26 Luke Palmer lrpaln...@gmail.com:
 On Thu, Mar 26, 2009 at 12:21 PM, Günther Schmidt gue.schm...@web.de
 wrote:

 Hi guys,

 I tried for days now to figure out a solution that Luke Palmer has
 presented me with, by myself, I'm getting nowhere.

 Sorry, I meant to respond earlier.

 They say you don't really understand something until you can explain it to a
 six year old.  So trying to explain this to a colleague made me realize how
 little I must understand it :-).  But I'll try by saying whatever come to
 mind...

 Lazy list processing is all about right associativity.  We need to be able
 to output some information knowing that our input looks like a:b:c:...,
 where we don't know the ...  I see IntTrie [a] as an infinite collection of
 lists (well, it is [[a]], after all :-), one for each integer.  So I want to
 take a structure like this:

 (1,2):(3,4):(3,5):...

 And turn it into a structure like this:

 {
 0 -> ...
 1 -> 2:...
 2 -> ...
 3 -> 4:5:...
 ...
 }

 (This is just a list in my implementation, but I intended it to be a trie,
 ideally, which is why I wrote the keys explicitly)

 So the yet-unknown information at the tail of the list turns into
 yet-unknown information about the tails of the keys.  In fact, if you
 replace ... with _|_, you get exactly the same thing (this is no
 coincidence!)

 The spine of this trie is maximally lazy: this is key.  If the structure of
 the spine depended on the input data (as it does for Data.Map), then we
 wouldn't be able to process infinite data, because we can never get it all.
 So even making a trie out of the list _|_ gives us:

 { 0 -> _|_, 1 -> _|_, 2 -> _|_, ... }

 I.e. the keys are still there.  Then we can combine two tries just by
 combining them pointwise (which is what infZipWith does).  It is essential
 that the pattern matches on infZipWith are lazy. We're zipping together an
 infinite sequence of lists, and normally the result would be the length of
 the shortest one, which is unknowable.  So the lazy pattern match forces the
 result ('s spine) to be infinite.

 Umm... yeah, that's a braindump.   Sorry I couldn't be more helpful.  I'm
 happy to answer any specific questions.

 Luke



 He has kindly provided me with this code:

 import Data.Monoid

 newtype IntTrie a = IntTrie [a]
    deriving Show

 singleton :: (Monoid a) => Int -> a -> IntTrie a
 singleton ch x = IntTrie $ replicate ch mempty ++ [x] ++ repeat mempty

 lookupTrie :: IntTrie a -> Int -> a
 lookupTrie (IntTrie xs) n = xs !! n

 instance (Monoid a) => Monoid (IntTrie a) where
    mempty                            = IntTrie (repeat mempty)
    mappend (IntTrie xs) (IntTrie ys) = IntTrie (infZipWith mappend xs ys)

 infZipWith f ~(x:xs) ~(y:ys) = f x y : infZipWith f xs ys

 test =  mconcat [singleton (n `mod` 42) [n] | n <- [0..]] `lookupTrie` 10

 It's supposed to eventually help me group a list of key value pairs and
 then further process them in a linear (streaming like) way.

 The original list being something like [('a', 23), ('b', 18), ('a', 34)
 ...].

 There are couple of techniques employed in this solution, but I'm just
 guessing here.

 The keywords I've been looking up so far:

 Memmoization, Deforestation, Single Pass, Linear Map and some others.

 Can someone please fill me in?

 Günther

 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe


 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Really need some help understanding a solution

2009-03-26 Thread Thomas Hartman
Re that link: search for wren's comments containing it is however
nicely coinductive

2009/3/26 Thomas Hartman tphya...@gmail.com:
 Luke, does your explanation to Guenther have anything to do with
 coinduction? -- the property that a producer gives a little bit of
 output at each step of recursion, which a consumer can than crunch in
 a lazy way?

 I find that coinduction seems to figure frequently in algos that
 process a stream.

 So, Guenther, I'm not certain if coinduction figures in here yet but
 it gives you another thing to google on. Co-induction seems to play
 the same role for stream processing in haskell that tail recursiveness
 plays in non-lazy languages
 like lisp. That is, it's kind of the ideal to be striven for. Whereas
 in haskell, tail recursive is frequently not the best thing because it
 goes into a non-terminating state when there is an infinite data
 structure which is crunched down to a finite one but at the wrong
 point in the function pipeline.

 see 
 http://groups.google.com/group/fa.haskell/browse_thread/thread/4240bc7c7abd4d30/49f28f5a41519335?q=it+is+however+nicely+coinductive#49f28f5a41519335


 2009/3/26 Luke Palmer lrpaln...@gmail.com:
 On Thu, Mar 26, 2009 at 12:21 PM, Günther Schmidt gue.schm...@web.de
 wrote:

 Hi guys,

 I tried for days now to figure out a solution that Luke Palmer has
 presented me with, by myself, I'm getting nowhere.

 Sorry, I meant to respond earlier.

 They say you don't really understand something until you can explain it to a
 six year old.  So trying to explain this to a colleague made me realize how
 little I must understand it :-).  But I'll try by saying whatever come to
 mind...

 Lazy list processing is all about right associativity.  We need to be able
 to output some information knowing that our input looks like a:b:c:...,
 where we don't know the ...  I see IntTrie [a] as an infinite collection of
 lists (well, it is [[a]], after all :-), one for each integer.  So I want to
 take a structure like this:

 (1,2):(3,4):(3,5):...

 And turn it into a structure like this:

 {
 0 -> ...
 1 -> 2:...
 2 -> ...
 3 -> 4:5:...
 ...
 }

 (This is just a list in my implementation, but I intended it to be a trie,
 ideally, which is why I wrote the keys explicitly)

 So the yet-unknown information at the tail of the list turns into
 yet-unknown information about the tails of the keys.  In fact, if you
 replace ... with _|_, you get exactly the same thing (this is no
 coincidence!)

 The spine of this trie is maximally lazy: this is key.  If the structure of
 the spine depended on the input data (as it does for Data.Map), then we
 wouldn't be able to process infinite data, because we can never get it all.
 So even making a trie out of the list _|_ gives us:

 { 0 -> _|_, 1 -> _|_, 2 -> _|_, ... }

 I.e. the keys are still there.  Then we can combine two tries just by
 combining them pointwise (which is what infZipWith does).  It is essential
 that the pattern matches on infZipWith are lazy. We're zipping together an
 infinite sequence of lists, and normally the result would be the length of
 the shortest one, which is unknowable.  So the lazy pattern match forces the
 result ('s spine) to be infinite.

 Umm... yeah, that's a braindump.   Sorry I couldn't be more helpful.  I'm
 happy to answer any specific questions.

 Luke



 He has kindly provided me with this code:

 import Data.Monoid

 newtype IntTrie a = IntTrie [a]
    deriving Show

 singleton :: (Monoid a) => Int -> a -> IntTrie a
 singleton ch x = IntTrie $ replicate ch mempty ++ [x] ++ repeat mempty

 lookupTrie :: IntTrie a -> Int -> a
 lookupTrie (IntTrie xs) n = xs !! n

 instance (Monoid a) => Monoid (IntTrie a) where
    mempty                            = IntTrie (repeat mempty)
    mappend (IntTrie xs) (IntTrie ys) = IntTrie (infZipWith mappend xs ys)

 infZipWith f ~(x:xs) ~(y:ys) = f x y : infZipWith f xs ys

 test =  mconcat [singleton (n `mod` 42) [n] | n <- [0..]] `lookupTrie` 10

 It's supposed to eventually help me group a list of key value pairs and
 then further process them in a linear (streaming like) way.

 The original list being something like [('a', 23), ('b', 18), ('a', 34)
 ...].

 There are couple of techniques employed in this solution, but I'm just
 guessing here.

 The keywords I've been looking up so far:

 Memmoization, Deforestation, Single Pass, Linear Map and some others.

 Can someone please fill me in?

 Günther

 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe


 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe



___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Use unsafePerformIO to catch Exception?

2009-03-26 Thread wren ng thornton

Jules Bean wrote:

wren ng thornton wrote:
 I have long been disappointed by a number of `error`s which shouldn't 
 be. For example, the fact that `head` and `div` are not total strikes 
 me as a (solvable) weakness of type checking, rather than things that 
 should occur as programmer errors/exceptions at runtime. The use of 
 `error` in these cases muddies the waters and leads to a laissez-faire 
 attitude about when it's acceptable to throw one's hands up in despair 
 and use `error` rather than writing a better function.


head uses error in precisely the correct, intended fashion.

head has a precondition (only call on non-empty lists)


And that is *exactly* my complaint: the precondition is not verified by 
the compiler. Therefore it does not exist in the semantic system, which 
is why the error screws up semantic analysis.


The type of head should not be [a] -> a + Error, it should be (a:[a]) -> 
a. With the latter type the compiler can ensure the precondition will be 
proved before calling head, thus eliminating erroneous calls.


It's a static error, detectable statically, and yet it's deferred to the 
runtime. I'd much rather the compiler catch my errors than needing to 
create an extensive debugging suite and running it after compilation. Is 
this not the promise of purity?



There are programming styles which avoid using 'head'. You are free to 
use those if you don't like it. I myself am content to use 'head' on 
lists which I know are guaranteed to be non-empty.


Avoiding head is not a tenable solution. The syntax for case matching is 
insufficiently first-class, so avoiding the function often leads to 
contortions, illegible code, or reinventing the problem. Moreover, this 
isn't an isolated case; fromJust is another canonical example, as is any 
other partial algebra-homomorphism. Many mathematical functions like div 
are also subject to trivially-verified invariants on their arguments.


Functions like uncons and viewL are nicer (because they're safe), but 
they can have overhead because they're unnecessarily complete (e.g. the 
Maybe wrapper can be avoided if we know a-priori that Just will be the 
constructor used).
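
As a point of comparison, a small hand-written sketch of that uncons style, so
it is self-contained (the helper names are illustrative, not from any library
mentioned here):

-- a total alternative to head: the empty case is visible in the result type
uncons :: [a] -> Maybe (a, [a])
uncons []     = Nothing
uncons (x:xs) = Just (x, xs)

-- a caller is forced to say what happens for the empty list
firstOrDefault :: a -> [a] -> a
firstOrDefault def xs = maybe def fst (uncons xs)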


--
Live well,
~wren
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Re: Really need some help understanding a solution

2009-03-26 Thread Günther Schmidt

Well Folks,

I've been programming for almost a decade now and making a living off it 
for almost 8 years.


To me programing in Haskell is sometimes quite a humbling experience, 
because I come to realize how shallow my ventures so far were.


The depth this language has is just amazing and the stuff that is 
tackled in this language is just  aaahhh. Can't quite put it in 
words, maybe something along the lines the ultimate thing, key to the 
universe I don't know.


Humbling and frustrating especially when you come across the Lukes and 
they do to you what he did to me in about 12 lines of code. And then you 
come across the blogs of the Lukes and see all the other things where 
you stand in awe and your jaw drops. And you realize how far away from 
that you still are.


And now you!

Sry for that Thomas, my prescription ran out today. I'll check on the 
link first thing once I got a refill.


Günther

Karl-Hinz, mei Drobbe!

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Re: about Haskell code written to be too smart

2009-03-26 Thread wren ng thornton

John Lato wrote:

 From: wren ng thornton w...@freegeek.org
 Dan Weston wrote:
  So to be clear with the terminology:
 
  inductive   = good consumer?
  coinductive = good producer?
 
  So fusion should be possible (automatically? or do I need a GHC rule?) with
inductive . coinductive
 
  Or have I bungled it?

 Not quite. Induction means starting from base cases and building things
 upwards from those. Coinduction is the dual and can be thought of as
 starting from the ceiling and building your way downwards (until you hit
 the base cases, or possibly forever).

( quite a lot of text trimmed for brevity)

Wren, thank you for contributing this post.  Coming from the point of
view of someone who doesn't grok it all yet, this is the best
commentary I've read on deforestation/fusion, and in fact is helpful
for category theory newbies as well (at least I found it so).  It
really should go on a wiki somewhere.


If you haven't read Rewriting Haskell Strings[1] yet, it's an excellent 
paper. There are a number of other canonical fusion papers, but RHS has 
the best introduction I've seen and it goes into both the build/fold and 
unfold/destroy styles. It doesn't have much on the category theory and 
recursion theory side of things though.


[1] http://www.cse.unsw.edu.au/~dons/papers/CSL06.html

--
Live well,
~wren
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Use unsafePerformIO to catch Exception?

2009-03-26 Thread Alexander Dunlap
On Thu, Mar 26, 2009 at 5:23 PM, wren ng thornton w...@freegeek.org wrote:
 Jules Bean wrote:

 wren ng thornton wrote:
  I have long been disappointed by a number of `error`s which shouldn't 
  be. For example, the fact that `head` and `div` are not total strikes  me
  as a (solvable) weakness of type checking, rather than things that  should
  occur as programmer errors/exceptions at runtime. The use of  `error` in
  these cases muddies the waters and leads to a laissez-faire  attitude 
  about
  when it's acceptable to throw one's hands up in despair  and use `error`
  rather than writing a better function.

 head uses error in precisely the correct, intended fashion.

 head has a precondition (only call on non-empty lists)

 And that is *exactly* my complaint: the precondition is not verified by the
 compiler. Therefore it does not exist in the semantic system, which is why
 the error screws up semantic analysis.

 The type of head should not be [a] -> a + Error, it should be (a:[a]) -> a.
 With the latter type the compiler can ensure the precondition will be proved
 before calling head, thus eliminating erroneous calls.

 It's a static error, detectable statically, and yet it's deferred to the
 runtime. I'd much rather the compiler catch my errors than needing to create
 an extensive debugging suite and running it after compilation. Is this not
 the promise of purity?

Ultimately, it's not detectable statically, is it? Consider

import Control.Applicative

main = do
  f <- lines <$> readFile "foobar"
  print (head (head f))

You can't know whether or not head will crash until runtime.

Alex
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] ANN: FallingBlocks 0.1

2009-03-26 Thread Ben Sanders
Hello,

I just uploaded fallingblocks to Hackage.  It is another Tetris clone, but
it uses SDL, and I thought there could be more SDL examples.

Any and all comments and suggestions will be extremely appreciated!

There is a darcs repo at
http://patch-tag.com/publicrepos/fallingblocks

Cheers!
-Ben
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Re: mapM as a Space Leak

2009-03-26 Thread Chung-chieh Shan
Thomas Hartman tphya...@gmail.com wrote in article 
910ddf450903261240p4e4fc8b3pa927fac1b80b2...@mail.gmail.com in 
gmane.comp.lang.haskell.cafe:
 Well, that's reassuring.
 
 The reason I asked is that the testp function didn't just show poor
 performance. The state monad implementation actually gave a different
 answer -- nonterminating, where the pattern matching solution
 terminated.

Indeed, DIFFERENT Haskell programs can give different answers, even on
the SAME Haskell implementation.  That should not be surprising at all.

-- 
Edit this signature at http://www.digitas.harvard.edu/cgi-bin/ken/sig
100 Days to close Guantanamo and end torture http://100dayscampaign.org/
http://www.avaaz.org/en/end_the_war_on_terror/

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Use unsafePerformIO to catch Exception?

2009-03-26 Thread Luke Palmer
On Thu, Mar 26, 2009 at 6:42 PM, Alexander Dunlap 
alexander.dun...@gmail.com wrote:

 On Thu, Mar 26, 2009 at 5:23 PM, wren ng thornton w...@freegeek.org
 wrote:
  It's a static error, detectable statically, and yet it's deferred to the
  runtime. I'd much rather the compiler catch my errors than needing to
 create
  an extensive debugging suite and running it after compilation. Is this
 not
  the promise of purity?

 Ultimately, it's not detectable statically, is it? Consider

 import Control.Applicative

 main = do
  f <- lines <$> readFile "foobar"
  print (head (head f))

 You can't know whether or not head will crash until runtime.


Static checkers are usually conservative, so this would be rejected.  In
fact, it's not always essential to depend on external information; eg. this
program:

(\x y -> y) (\x -> x x) (\x -> x)

Behaves just like the identity function, so surely it should have type a ->
a, but it is rejected because type checking is (and must be) conservative.

Keeping constraints around that check that head is well-formed is a pretty
hard thing to do.  Case expressions are easier to check for totality, but
have the disadvantages that wren mentions.

Much as we would like to pressure the language to support static
constraints, I don't think we are yet in a position to.  It's not the kind
of thing you can just throw on and be done with it; see Conor McBride's
current project for an example of the issues involved.

Luke
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Use unsafePerformIO to catch Exception?

2009-03-26 Thread wren ng thornton

Alexander Dunlap wrote:

wren ng thornton wrote:
 Jules Bean wrote:
  head uses error in precisely the correct, intended fashion.
 
  head has a precondition (only call on non-empty lists)

 And that is *exactly* my complaint: the precondition is not verified by the
 compiler. Therefore it does not exist in the semantic system, which is why
 the error screws up semantic analysis.

 The type of head should not be [a] -> a + Error, it should be (a:[a]) -> a.
 With the latter type the compiler can ensure the precondition will be proved
 before calling head, thus eliminating erroneous calls.

 It's a static error, detectable statically, and yet it's deferred to the
 runtime. I'd much rather the compiler catch my errors than needing to create
 an extensive debugging suite and running it after compilation. Is this not
 the promise of purity?

Ultimately, it's not detectable statically, is it? Consider

import Control.Applicative

main = do
  f <- lines <$> readFile "foobar"
  print (head (head f))

You can't know whether or not head will crash until runtime.


The issue is the obligation of proof, not what the actual runtime values 
are. Types give a coarsening of the actual runtime values, but they're 
still fine-grained enough to catch many errors (e.g. we know readFile 
will return some String and not an Int, even if we don't know which 
particular string it'll return).


The proposal is to enhance the vocabulary of types such that we can 
require certain new proofs (e.g. having head require the proof that its 
argument is non-empty). The only way to discharge the obligation is to 
provide a proof. In this case, the way we provide a proof is by using a 
case expression--- it knows the actual runtime value because it 
evaluates things enough to match the patterns. For example, consider


let x = ... in case x of
    x1@p1 -> f x1 ...
    x2@p2 -> g x2 ...

Under the current system x, x1, and x2 all have the same type, by 
definition. We can relax this however since it is guaranteed that by the 
time x1 enters scope it will be bound to a value that adheres to the 
pattern p1; and we similarly know that x2 must be bound to a value that 
adheres to p2 (Removing things that also match p1). With that in mind, 
if the function f requires a proof of p1 about x1 (or g requires proof 
of p2 about x2), those obligations are discharged because the calls to f 
and g are guarded by the case expression.


Given such a type system, your example would fail to type check because 
lines does not offer the guarantee that the return value is of type 
((a:[a]):[[a]]). That is, the compiler will tell you that you need to 
provide proofs, i.e. handle all cases.


As always, the devil is in the details; but that's what research is for :)

--
Live well,
~wren
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Use unsafePerformIO to catch Exception?

2009-03-26 Thread wren ng thornton

Luke Palmer wrote:

Alexander Dunlap wrote:
 Ultimately, it's not detectable statically, is it? Consider

 import Control.Applicative

 main = do
  f <- lines <$> readFile "foobar"
  print (head (head f))

 You can't know whether or not head will crash until runtime.

Static checkers are usually conservative, so this would be rejected.  In
fact, it's not always essential to depend on external information; eg. this
program:

 (\x y -> y) (\x -> x x) (\x -> x)
 
 Behaves just like the identity function, so surely it should have type a ->
a, but it is rejected because type checking is (and must be) conservative.

Keeping constraints around that check that head is well-formed is a pretty
hard thing to do.  Case expressions are easier to check for totality, but
have the disadvantages that wren mentions.


My idea amounts to trying to make case expressions more first-class than 
they are now. As Luke says, we'd have to be conservative about it (until 
the dependent-types and total-functional-programming folks do the 
impossible), but I think there's still plenty of room for biting off a 
useful chunk of the domain without falling off that cliff.


--
Live well,
~wren
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: Exception handling in numeric computations

2009-03-26 Thread wren ng thornton

Jonathan Cast wrote:

Xiao-Yong Jin wrote:
  Xiao-Yong Jin wrote:
   So I have another question.  Is the following function safe
   and legitimate?
  
   safeDiv :: (Exception e, Integral a) =>
  a -> a -> Either e a
   safeDiv x y = unsafePerformIO . try . evaluate $ div x y

 safeDiv' :: (Exception e, Integral a) =>
 a -> a -> Either e a
 safeDiv' _ 0 = Left e
 safeDiv' x y = Right $ div x y

[...]
Other than that, I think the imprecise exceptions paper guarantees that
these two functions are equivalent (albeit unwisely: see below).


I don't think so. The evaluation of x and y may throw errors before we 
get around to div.


* safeDiv' will evaluate y (to pattern match against 0) and may return 
an error, e, whereas safeDiv will return Left e if div is strict in y.


* safeDiv' postpones evaluating x and so may return Right e, whereas 
safeDiv will return Left e if div is strict in x.



--
Live well,
~wren
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: Exception handling in numeric computations

2009-03-26 Thread Jonathan Cast
On Thu, 2009-03-26 at 21:57 -0400, wren ng thornton wrote:
 Jonathan Cast wrote:
  Xiao-Yong Jin wrote:
Xiao-Yong Jin wrote:
 So I have another question.  Is the following function safe
 and legitimate?

  safeDiv :: (Exception e, Integral a) =>
 a -> a -> Either e a
 safeDiv x y = unsafePerformIO . try . evaluate $ div x y
  
   safeDiv' :: (Exception e, Integral a) =>
   a -> a -> Either e a
   safeDiv' _ 0 = Left e
   safeDiv' x y = Right $ div x y
  
  [...]
  Other than that, I think the imprecise exceptions paper guarantees that
  these two functions are equivalent (albeit unwisely: see below).
 
 I don't think so. The evaluation of x and y may throw errors before we 
 get around to div.

Sure.  Which also points out that the original safeDiv wasn't actually
safe, since there's no guarantee of what evaluate will do with x and y.
(Actually, there's not much guarantee of what evaluate does anyway ---
just that any errors in e's definition get turned into exceptions by the
time evaluate e finishes running, or don't turn into exceptions at all).

 * safeDiv' will evaluate y (to pattern match against 0) and may return 
 an error, e, whereas safeDiv will return Left e if div is strict in y.
 
 * safeDiv' postpones evaluating x and so may return Right e, whereas 
 safeDiv will return Left e if div is strict in x.

jcc


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Re: about Haskell code written to be too smart

2009-03-26 Thread Achim Schneider
wren ng thornton w...@freegeek.org wrote:

 Colin Adams wrote:
  2009/3/25 wren ng thornton w...@freegeek.org:
  when I look up the Haddock-generated documentation for a function, I
  DON'T appreciate it if that is in the form of a hyperlink to a
  research paper.
  And that occurs in several of the libraries shipped with GHC for
  instance.
  
  A reference to a research paper is fine to show where the ideas came
  from, but that is not where the library documentation should be.
 
 Yeah, that's bad. 'Documentation' like that should be corrected with 
 Extreme Prejudice.
 
The main problem with research papers as documentation is the papers
usually being outdated wrt. the current library version: Literate
Haskell is utterly underused.


-- 
(c) this sig last receiving data processing entity. Inspect headers
for copyright history. All rights reserved. Copying, hiring, renting,
performance and/or quoting of this signature prohibited.


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Re: Really need some help understanding a solution

2009-03-26 Thread wren ng thornton

Günther Schmidt wrote:
The depth this language has is just amazing and the stuff that is 
tackled in this language is just  aaahhh. Can't quite put it in 
words, maybe something along the lines the ultimate thing, key to the 
universe I don't know.


Humbling and frustrating especially when you come across the Lukes and 
they do to you what he did to me in about 12 lines of code. And then you 
come across the blogs of the Lukes and see all the other things where 
you stand in awe and your jaw drops. And you realize how far away from 
that you still are.


If you enjoy jaw-dropping brain-exploding fun, have you seen Martin 
Escardo's work on exhaustively searching infinite spaces?



http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/
http://math.andrej.com/2008/11/21/a-haskell-monad-for-infinite-search-in-finite-time/

--
Live well,
~wren
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Really need some help understanding a solution

2009-03-26 Thread wren ng thornton

Thomas Hartman wrote:

Luke, does your explanation to Guenther have anything to do with
coinduction? -- the property that a producer gives a little bit of
output at each step of recursion, which a consumer can than crunch in
a lazy way?


It has more to do with tying the knot (using laziness to define values 
in terms of themselves), though there are similarities. Take the function:


infZipWith f ~(x:xs) ~(y:ys) = f x y : infZipWith f xs ys

which we could write less clearly as:

infZipWith f xxs yys =
f (head xxs) (head yys) : infZipWith f (tail xxs) (tail yys)

The trick is that we can emit the thunk for f(head xxs)(head yys) 
without knowing what values xxs and yys actually contain--- they could 
still be _|_! The hope is that by the time we get to where someone asks 
for the value of that thunk ---by that point--- we *will* know enough 
about xxs and yys that we can give an answer.
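
A small sketch of that point (the type signature is added here for clarity):
because of the lazy patterns, the spine of the result is defined even when the
inputs are entirely undefined.

infZipWith :: (a -> b -> c) -> [a] -> [b] -> [c]
infZipWith f ~(x:xs) ~(y:ys) = f x y : infZipWith f xs ys

-- taking five elements and measuring the length never forces the inputs,
-- so this evaluates to 5 rather than diverging
spineDemo :: Int
spineDemo = length (take 5 (infZipWith (+) (undefined :: [Int]) undefined))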


For knot tying to work, we must ensure that we don't ask for things 
from the future before we've actually created them. If we do, then the 
function will diverge, i.e. _|_. This shares similarities with the ideal 
1-to-1 producer/consumer role for deforestation (whence, similarities to 
coinduction).


--
Live well,
~wren
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Really need some help understanding a solution

2009-03-26 Thread David Menendez
2009/3/26 Luke Palmer lrpal...@gmail.com:
 The spine of this trie is maximally lazy: this is key.  If the structure of
 the spine depended on the input data (as it does for Data.Map), then we
 wouldn't be able to process infinite data, because we can never get it all.
 So even making a trie out of the list _|_ gives us:

 { 0 -> _|_, 1 -> _|_, 2 -> _|_, ... }

 I.e. the keys are still there.  Then we can combine two tries just by
 combining them pointwise (which is what infZipWith does).  It is essential
 that the pattern matches on infZipWith are lazy. We're zipping together an
 infinite sequence of lists, and normally the result would be the length of
 the shortest one, which is unknowable.  So the lazy pattern match forces the
 result ('s spine) to be infinite.

It's also important that (++) is non-strict in its second argument.
Using infZipWith with (+) requires examining the entire input before
getting any output.

Unfortunately, Günther's original post indicated that he wanted the
*sum* of the values for each key. Unless you're using non-strict
natural numbers, that pretty much requires examining the entire input
list before producing any output.

--

Incidentally, this also works (in that it produces the right answers):

type MultiMap k v = k -> [v]

-- The Monoid instance is pre-defined; here's the relevant code:
-- mappend map1 map2 = \k -> map1 k ++ map2 k

singleton :: Eq k => k -> v -> MultiMap k v
singleton k v = \k' -> if k == k' then [v] else []

test = mconcat [ singleton (n `mod` 42) n | n <- [0..]] 10

Or, using insert instead of singleton/union,

insert :: Eq k => k -> v -> MultiMap k v -> MultiMap k v
insert k v map = \k' -> if k == k' then v : map k' else map k'

fromList :: Eq k => [(k,v)] -> MultiMap k v
fromList = foldr (\(k,v) -> insert k v) (const [])

Note that insert is non-strict in its third argument, meaning that
foldr can return an answer immediately.
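
For instance, a quick usage sketch of the definitions above (the input list
echoes the one from earlier in the thread, and the result is just read off the
definitions):

-- looking up 'a' walks the chain of inserts and collects both of its values
example :: [Int]
example = fromList [('a', 23), ('b', 18), ('a', 34)] 'a'   -- [23,34]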

-- 
Dave Menendez d...@zednenem.com
http://www.eyrie.org/~zednenem/
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe