[Haskell-cafe] Should Yesod.Mail be a separate package?

2010-10-16 Thread Michael Snoyman
Hey all,

I wrote a simple email module in Yesod[1] that handles such things as
multipart messages and Base64 encoding. It's still missing some
features (multipart/alternative, for instance), but it can be useful
for throwing together emails. It's currently part of the yesod
package, but I'm going to be moving it to a separate package to free
me up to make breaking API changes more frequently.

As of right now, I'm just going to move it into the yesod-auth package
(also being split off from the main yesod package), and therefore it
will still have all the dependencies on Yesod. My question is whether
people would find this package useful outside the scope of Yesod.
There are no dependencies from this module onto any Yesod-specific
stuff, so this separation could easily be done. I just don't feel like
adding another package to maintain if no one is interested.

So if anyone wants this offered up as a separate package, and/or has
any API suggestions, please let me know.

Michael

[1] 
http://hackage.haskell.org/packages/archive/yesod/0.5.4/doc/html/Yesod-Mail.html


[Haskell-cafe] Re: Proving stuff about IORefs

2010-10-16 Thread Ben Franksen
Derek Elkins wrote:
> On Sat, Oct 16, 2010 at 9:21 PM, Ben Franksen 
> wrote:
>> I have a formal proof where I am stuck at a certain point.
>>
>> Suppose we have a function
>>
>> f :: IORef a -> IO b
>>
>> I want to prove that
>>
>> f r == do
>> s1 <- readIORef r
>> r' <- newIORef s1
>> x <- f r'
>> s3 <- readIORef r'
>> writeIORef r s3
>> return x
>>
>> What happens here is that the temporary IORef r' takes the place of the
>> argument r, and after we apply f to it we take its content and store it
>> in the original r. This should be the same as using r as argument to f in
>> the first place.
>>
>> How can I prove this formally?
> 
> You haven't provided us with any information about the formal model
> you are using and your question is somewhat ambiguously phrased, hence
> Thomas' response where, I'm pretty sure, he misunderstood what you
> were asking.

I don't have a model. Up to this point I can make do with equational
reasoning.

This is the context. I have this class

  class Embed i o where
type Content i o
embed :: (Content i o -> i a) -> o a
callback :: o a -> Content i o -> i a

which I _think_ should have these laws attached

  L1:embed . callback == id
  L2:callback . embed == id

and an instance

  newtype StateIO s a = StateIO { unStateIO :: StateT s IO a }

  instance Embed IO (StateIO s) where
type Content IO (StateIO s) = IORef s
embed f = StateIO $ StateT $ \s -> do
  r <- newIORef s
  x <- f r
  s' <- readIORef r
  return (x, s')
callback action r = do
  s <- readIORef r
  (x, s') <- runStateT (unStateIO action) s
  writeIORef r s'
  return x
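
To make this concrete, a small usage sketch (bump and bumpIO are just
illustrative names; it assumes the definitions above together with
Control.Monad.State and Data.IORef in scope):

  import Control.Monad.State (modify)

  -- a stateful action in the wrapped monad
  bump :: StateIO Int ()
  bump = StateIO $ modify (+ 1)

  -- callback runs it against a caller-supplied IORef ...
  bumpIO :: IORef Int -> IO ()
  bumpIO = callback bump

  -- ... and embed wraps an IORef-based action back up again;
  -- law L2 says bump' should behave exactly like bump
  bump' :: StateIO Int ()
  bump' = embed bumpIO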

The original idea comes from this message

  http://www.haskell.org/pipermail/haskell-cafe/2007-July/028501.html

but I have deviated somewhat from Jules' notation and generalised.

Now I want to prove the laws. L1 is straightforward:

embed (callback o)
  = { def embed }
StateIO $ StateT $ \s1 -> do
  r <- newIORef s1
  x <- callback o r
  s4 <- readIORef r
  return (x, s4)
  = { def callback }
StateIO $ StateT $ \s1 -> do
  r <- newIORef s1
  x <- do
s2 <- readIORef r
(x, s3) <- runStateT (unStateIO o) s2
writeIORef r s3
return x
  s4 <- readIORef r
  return (x, s4)
  = { Monad laws }
StateIO $ StateT $ \s1 -> do
  r <- newIORef s1
  s2 <- readIORef r
  (x, s3) <- runStateT (unStateIO o) s2
  writeIORef r s3
  s4 <- readIORef r
  return (x, s4)
  = { IORef laws }
StateIO $ StateT $ \s1 -> do
  r <- newIORef s1
  (x, s3) <- runStateT (unStateIO o) s1
  writeIORef r s3
  return (x, s3)
  = { reorder (r is unused in second stmt), Monad laws }
StateIO $ StateT $ \s1 -> do
  (x, s3) <- runStateT (unStateIO o) s1
  r <- newIORef s1
  writeIORef r s3
  return (x, s3)
  = { IORef laws }
StateIO $ StateT $ \s1 -> do
  (x, s3) <- runStateT (unStateIO o) s1
  return (x, s3)
  = { Monad laws }
StateIO $ StateT $ \s1 -> runStateT (unStateIO o) s1
  = {def StateIO, StateT }
o

You might question the step marked { IORef laws }. I don't know if this has
been formalized anywhere but I thought it safe to assume a law that states

  do
r <- newIORef a
b <- readIORef r
g b

  =

  do
r <- newIORef a
g a

assuming that a is not used any further.

Similarly I have used the "law"

  do
writeIORef r a
b <- readIORef r
g b

  =

  do
writeIORef r a
g a

Both of these are so obviously satisfied that I accept them as axioms.

Now, when I try to prove L2, I can reason similarly and get

callback (embed f) r
  = { def callback }
do
  s1 <- readIORef r
  (x, s4) <- runStateT (unStateIO (embed f)) s1
  writeIORef r s4
  return x
  = { def embed }
do
  s1 <- readIORef r
  (x, s4) <- runStateT (unStateIO $ StateIO $ StateT $ \s2 -> do
  r' <- newIORef s2
  x <- f r'
  s3 <- readIORef r'
  return (x, s3)
) s1
  writeIORef r s4
  return x
  = { def StateIO, StateT, beta reduce }
do
  s1 <- readIORef r
  (x, s4) <- do
  r' <- newIORef s1
  x <- f r'
  s3 <- readIORef r'
  return (x, s3)
  writeIORef r s4
  return x
  = { Monad laws }
do
  s1 <- readIORef r
  r' <- newIORef s1
  x <- f r'
  s3 <- readIORef r'
  writeIORef r s3
  return x
  = { IORef laws }
do
  s1 <- readIORef r
  r' <- newIORef s1
  x <- f r'
  s3 <- readIORef r'
  writeIORef r s3
  return x
  = { ??? }
f r

and I would like to reduce the last step to the same level of obviousness as
in the previous proof.

> At any rate, if you intend to prove this for any arbitrary f, I can't
> tell you how to prove it formally because it's not true.

How do you know that? Can you give an example where it fails?
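
(For illustration, one family of counterexamples: any f that lets the
identity of its argument escape, for instance by simply returning the IORef
it was given. Then the left-hand side hands back r itself while the
right-hand side hands back the temporary r', and a later write through the
result is visible via r in one case but not the other. The names leak,
wrapped and distinguish below are just illustrative.)

  import Data.IORef

  -- an f whose result leaks the reference it was given
  leak :: IORef a -> IO (IORef a)
  leak = return

  -- the right-hand side of the equation, specialised to leak
  wrapped :: IORef a -> IO (IORef a)
  wrapped r = do
    s1 <- readIORef r
    r' <- newIORef s1
    x <- leak r'
    s3 <- readIORef r'
    writeIORef r s3
    return x

  distinguish :: IO (Int, Int)
  distinguish = do
    r <- newIORef 0
    rl <- leak r             -- left-hand side: hands back r itself
    writeIORef rl 1
    left <- readIORef r      -- 1
    writeIORef r 0
    rr <- wrapped r          -- right-hand side: hands back the temporary r'
    writeIORef rr 1
    right <- readIORef r     -- still 0
    return (left, right)     -- (1, 0): the two sides are distinguishable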

Cheers
Ben


Re: [Haskell-cafe] Re: Re: A rant against the blurb on the Haskell front page

2010-10-16 Thread Jeremy Shaw

On Oct 16, 2010, at 9:03 PM, Ben Franksen wrote:

Still, "Haskell is an open source product" doesn't sound right to me.
Even "Haskell is open source" (without the "product") has a bad ring
because "source" is short for "source code" and source code is not
something a programming language has.


How about something like,

Haskell has an open design and development model driven by commercial  
and academic research, industry users, hobbyists, and other members of  
the community.


Or something to that effect? (I'm not wild about the word 'has'
there...)


- jeremy


[Haskell-cafe] Re: Re: A rant against the blurb on the Haskell front page

2010-10-16 Thread Ben Franksen
wren ng thornton wrote:
> On 10/16/10 11:22 AM, Ben Franksen wrote:
>> Much better. Though I *do* think mentioning the main implementations and
>> their qualities is a good thing to do, right after this:
>>
>> "[...]The most
>> important Haskell implementation, ghc [link to ghc page], has served as a
>> test bed for practical application of cutting edge research into the
>> language as well as its compilation to efficiently executable code."
> 
> Objection to calling GHC the most "important". The most mature, most
> fully featured, most common, or even the standard implementation, sure.
> But saying GHC is more important than the rest implies that (among
> others) the work on JHC and UHC is "unimportant". To the contrary, I
> think JHC and UHC are, perhaps, more important than GHC precisely
> because they are treading new waters that the standard implementation
> cannot afford to explore.

Right on all counts. Can one say "most mature and full-featured"?

Cheers
Ben



Re: [Haskell-cafe] Proving stuff about IORefs

2010-10-16 Thread Derek Elkins
On Sat, Oct 16, 2010 at 9:21 PM, Ben Franksen  wrote:
> I have a formal proof where I am stuck at a certain point.
>
> Suppose we have a function
>
>  f :: IORef a -> IO b
>
> I want to prove that
>
>  f r == do
>    s1 <- readIORef r
>    r' <- newIORef s1
>    x <- f r'
>    s3 <- readIORef r'
>    writeIORef r s3
>    return x
>
> What happens here is that the temporary IORef r' takes the place of the
> argument r, and after we apply f to it we take its content and store it in
> the original r. This should be the same as using r as argument to f in the
> first place.
>
> How can I prove this formally?

You haven't provided us with any information about the formal model
you are using and your question is somewhat ambiguously phrased, hence
Thomas' response where, I'm pretty sure, he misunderstood what you
were asking.

At any rate, if you intend to prove this for any arbitrary f, I can't
tell you how to prove it formally because it's not true.

Regardless, this email has far too little information for anyone to
provide you an answer.


[Haskell-cafe] Text.Regex Segfault

2010-10-16 Thread Duane Johnson
I bumped into a segmentation fault in the regex library today and thought
I'd warn others in case similar behavior is observed:

Prelude> :m Text.Regex
> Prelude Text.Regex> map read (splitRegex (mkRegex "\\|") "0|1|2|4") ::
> [Int]
> Loading package syb ... linking ... done.
> Loading package array-0.2.0.0 ... linking ... done.
> Loading package bytestring-0.9.1.4 ... linking ... done.
> Loading package regex-base-0.72.0.2 ... linking ... done.
> Loading package regex-posix-0.72.0.3 ... linking ... done.
> Loading package regex-compat-0.71.0.1 ... linking ... done.
> [0,1,2,4]
> Prelude Text.Regex> Segmentation fault
>

Regards,
Duane


Re: [Haskell-cafe] Proving stuff about IORefs

2010-10-16 Thread Thomas DuBuisson
I must be missing the point of the proof.  The value of 'f r' is _|_.
Practically speaking, it will eventually stack overflow.  Why is
proving anything about this interesting?  Why do you think the store
will ever happen on the original r?

Cheers,
Thomas

On Sat, Oct 16, 2010 at 6:21 PM, Ben Franksen  wrote:
> I have a formal proof where I am stuck at a certain point.
>
> Suppose we have a function
>
>  f :: IORef a -> IO b
>
> I want to prove that
>
>  f r == do
>    s1 <- readIORef r
>    r' <- newIORef s1
>    x <- f r'
>    s3 <- readIORef r'
>    writeIORef r s3
>    return x
>
> What happens here is that the temporary IORef r' takes the place of the
> argument r, and after we apply f to it we take its content and store it in
> the original r. This should be the same as using r as argument to f in the
> first place.
>
> How can I prove this formally?
>
> Cheers
> Ben
>


Re: [Haskell-cafe] Re: Re: A rant against the blurb on the Haskell front page

2010-10-16 Thread Tony Morris


On 17/10/10 12:03, Ben Franksen wrote:
> wren ng thornton wrote:
>   
>> On 10/16/10 10:48 AM, Ben Franksen wrote:
>> 
>>> Don Stewart wrote:
>>>   
 It is open source, and was born open source. It is the product of
 research.
 
>>> How can a language be open source, or rather, how can it *not* be open
>>> source? The point of a (programming) language is that it has a published
>>> ('open') definition. Nothing prevents anyone from creating a proprietary
>>> compiler or interpreter for Haskell, AFAIK.
>>>   
>> Miranda[TM] is/was a proprietary language, quite definitively so. If
>> nothing else, this should be apparent by the fact that every reference
>> to it in research papers of the era (a) included the TM sigil, and (b)
>> had footnotes indicating who the IP holders are. That was before my
>> time, but I was under the impression that Haskell was open from the
>> beginning ---by express intention--- in order to enable work on lazy
>> functional languages without being encumbered by Miranda[TM]'s closed
>> nature.
>>
>> For that matter, until rather recently Java was very much a closed
>> language defined by the runtime system provided by Sun Microsystems and
>> not defined by the sequence of characters accepted by that system, nor
>> by the behavior of the system when it accepts them. Sun even went
>> through some trouble to try to shut out competitive development of
>> runtime systems such as SoyLatte, IcedTea, and the like.
>>
>> Even the venerable C language has a long history of companies making
>> proprietary extensions to the language in order to require you to buy
>> their compiler, and they would most certainly pursue legal action if
>> someone else copied the features. This is why GCC is as big a coup for
>> the free/open-source movement as Linux is--- long before GCC changed its
>> name and focus to being a compiler collection.
>>
>> The languages which are open-source are in close correspondence with the
>> languages which have a free/open-source implementation. There are a lot
>> of them, including the vast majority of recent languages. But don't be
>> seduced into thinking that a language is a predicate on acceptable
>> strings, a transducer from those strings into computer behaviors, or
>> that such predicates and transducers are public domain.
>> 
> Sigh. Yes, you are right, of course. All this is true, sadly. There are
> stupid people who think that they can own a programming language. I hope
> they will go the way all the other mis-adapted creatures have gone and just
> die out.
>
> Still, "Haskell is an open source product" doesn't sound right to me.
> Even "Haskell is open source" (without the "product") has a bad ring
> because "source" is short for "source code" and source code is not
> something a programming language has.
>
> I agree that "non-proprietary" is a valid and important characterization of
> the language. This should be mentioned where we speak about libraries and
> community, since the active and friendly community is the motor behind the
> growing set of libraries, and you get this sort of participation only with
> a free/non-proprietary language. This applies not only to individuals but
> to companies as well, maybe even more.
>
> I anticipate the objection that potential commercial users might be scared
> off by the terms "non-proprietary" or "free", whereas the term "open
> source" has been coined to (and probably actually does) sound more commerce
> friendly. To countermand such an effect, we can point out that most
> libraries have non-copyleft licenses and that there are a number of
> companies who have done and still do a lot to support and advance Haskell.
>
> Cheers
> Ben
>
I am somewhat sympathetic to your argument, but I care far less overall.

Nevertheless, perhaps this would appease:

"Haskell is an open standard with a robust open source implementation."

-- 
Tony Morris
http://tmorris.net/




[Haskell-cafe] Re: Re: A rant against the blurb on the Haskell front page

2010-10-16 Thread Ben Franksen
wren ng thornton wrote:
> On 10/16/10 10:48 AM, Ben Franksen wrote:
>> Don Stewart wrote:
>>> It is open source, and was born open source. It is the product of
>>> research.
>>
>> How can a language be open source, or rather, how can it *not* be open
>> source? The point of a (programming) language is that it has a published
>> ('open') definition. Nothing prevents anyone from creating a proprietary
>> compiler or interpreter for Haskell, AFAIK.
> 
> Miranda[TM] is/was a proprietary language, quite definitively so. If
> nothing else, this should be apparent by the fact that every reference
> to it in research papers of the era (a) included the TM sigil, and (b)
> had footnotes indicating who the IP holders are. That was before my
> time, but I was under the impression that Haskell was open from the
> beginning ---by express intention--- in order to enable work on lazy
> functional languages without being encumbered by Miranda[TM]'s closed
> nature.
> 
> For that matter, until rather recently Java was very much a closed
> language defined by the runtime system provided by Sun Microsystems and
> not defined by the sequence of characters accepted by that system, nor
> by the behavior of the system when it accepts them. Sun even went
> through some trouble to try to shut out competitive development of
> runtime systems such as SoyLatte, IcedTea, and the like.
> 
> Even the venerable C language has a long history of companies making
> proprietary extensions to the language in order to require you to buy
> their compiler, and they would most certainly pursue legal action if
> someone else copied the features. This is why GCC is as big a coup for
> the free/open-source movement as Linux is--- long before GCC changed its
> name and focus to being a compiler collection.
> 
> The languages which are open-source are in close correspondence with the
> languages which have a free/open-source implementation. There are a lot
> of them, including the vast majority of recent languages. But don't be
> seduced into thinking that a language is a predicate on acceptable
> strings, a transducer from those strings into computer behaviors, or
> that such predicates and transducers are public domain.

Sigh. Yes, you are right, of course. All this is true, sadly. There are
stupid people who think that they can own a programming language. I hope
they will go the way all the other mis-adapted creatures have gone and just
die out.

Still, "Haskell is an open source product" doesn't sound right to me.
Even "Haskell is open source" (without the "product") has a bad ring
because "source" is short for "source code" and source code is not
something a programming language has.

I agree that "non-proprietary" is a valid and important characterization of
the language. This should be mentioned where we speak about libraries and
community, since the active and friendly community is the motor behind the
growing set of libraries, and you get this sort of participation only with
a free/non-proprietary language. This applies not only to individuals but
to companies as well, maybe even more.

I anticipate the objection that potential commercial users might be scared
off by the terms "non-proprietary" or "free", whereas the term "open
source" has been coined to (and probably actually does) sound more commerce
friendly. To countermand such an effect, we can point out that most
libraries have non-copyleft licenses and that there are a number of
companies who have done and still do a lot to support and advance Haskell.

Cheers
Ben



Re: [Haskell-cafe] An interesting paper from Google

2010-10-16 Thread Bernie Pope
On 17 October 2010 11:25, Dan Doel  wrote:
> On Saturday 16 October 2010 7:04:23 pm Ben Millwood wrote:
>> On Fri, Oct 15, 2010 at 9:28 PM, Andrew Coppin
>>
>>  wrote:
>> > I'm still quite
>> > surprised that there's no tool anywhere which will trivially print out
>> > the reduction sequence for executing an expression. You'd think this
>> > would be laughably easy, and yet nobody has done it yet.
>>
>> I tried to do something a bit like this:
>>
>> http://github.com/benmachine/stepeval
>>
>> but it could be charitably described as "crude": has three failing
>> testcases and a bagful of unimplemented functionality.
>
> I believe the Buddha debugger could do something like this, as well, although
> normally you wouldn't dump the entire sequence.

Buddha is/was a declarative debugger. The basic idea is to build a
computation tree, and then search the tree for buggy nodes. Normally
the debugger would decide how to traverse the tree, and the user would
simply make judgements about the correctness of reductions stored in
the visited nodes. However, buddha also allowed the user to explore
the tree manually. None of this was particularly unique to buddha, and
most other declarative debuggers allow you to do this too.
Unfortunately most declarative debuggers don't make it far past the
proof of concept stage.

The HAT tracer also supports/supported declarative debugging and has
many useful trace exploration tools.

> But it has bit rotted,
> unfortunately (it's quite tied to GHC internals, as far as I can tell).

As the author of buddha I can confirm that it hasn't been maintained.
The main dependency on GHC is a small amount of code for printing data
structures. In fact some of that code would be easier to do now than
it was then, because GHC includes data constructor names by default in
compiled code (this was added to support the ghci breakpoint
debugger).

> I never used it, but I've had at least one person tell me it was the best
> debugger they'd ever used. You type in an expression, and continually step
> into different parts of the reduction sequence until you find some core source
> of whatever error you're looking for.

I'm happy to hear someone liked it so much. Declarative debugging is a
very nice idea (invented for Prolog by Ehud Shapiro - he is now famous
for DNA computing), but it is hard to make practical. Probably the
best declarative debugger I know of is the one provided for the
Mercury programming language. However, Mercury is a strict language,
which simplifies some aspects of the design.

The problem is that it is hard to scale to long running computations,
because the computation tree can become huge. HAT tackles this problem
by saving a trace record to file, although this can have rather high
runtime overheads. The declarative debugger for the Mercury language
tackles this problem by piecemeal construction of the computation
tree, and by regenerating parts of it on demand by re-execution of
code. Re-execution in a lazy language is quite challenging (I tried to
do it in buddha).

I have some ideas about interspersing declarative debugging with
computation, but never had the time to implement it (though I think it
would make a great research project).

> If someone were to revive it, I'm sure many people would be appreciative.

I think it might be better to put more effort into HAT.

Cheers,
Bernie.


[Haskell-cafe] Proving stuff about IORefs

2010-10-16 Thread Ben Franksen
I have a formal proof where I am stuck at a certain point.

Suppose we have a function

  f :: IORef a -> IO b

I want to prove that

  f r == do
s1 <- readIORef r
r' <- newIORef s1
x <- f r'
s3 <- readIORef r'
writeIORef r s3
return x

What happens here is that the temporary IORef r' takes the place of the
argument r, and after we apply f to it we take its content and store it in
the original r. This should be the same as using r as argument to f in the
first place.

How can I prove this formally?

Cheers
Ben



Re: [Haskell-cafe] An interesting paper from Google

2010-10-16 Thread wren ng thornton

On 10/16/10 8:25 PM, Dan Doel wrote:

On Saturday 16 October 2010 7:04:23 pm Ben Millwood wrote:

On Fri, Oct 15, 2010 at 9:28 PM, Andrew Coppin  wrote:

I'm still quite
surprised that there's no tool anywhere which will trivially print out
the reduction sequence for executing an expression. You'd think this
would be laughably easy, and yet nobody has done it yet.


I tried to do something a bit like this:

http://github.com/benmachine/stepeval

but it could be charitably described as "crude": has three failing
testcases and a bagful of unimplemented functionality.


I believe the Buddha debugger could do something like this, [...]
If someone were to revive it, I'm sure many people would be appreciative.


I've been wanting something like this recently. Or more particularly, 
I've been wanting a partial evaluator somewhat like Coq's simpl tactic 
(and lazy, cbv, red, hnf,...), which can look up the definitions of 
functions to inline them and continue evaluation, and which can work 
around variables and thus under lambdas. Though it would be nice to be 
able to dump the whole sequence instead of just the result.


/me wonders how much time it'd suck away to actually implement it...

--
Live well,
~wren


Re: [Haskell-cafe] Systematic treatment of static arguments

2010-10-16 Thread wren ng thornton

On 10/16/10 3:39 PM, Stephen Tetley wrote:

Hello list

The Monad and Applicative instances for functions are "equivalent" to
the respective Reader versions (I use "equivalent" along the lines of -
operationally the same but without the type distinction / newtype).
There is also the Monoid instance for functions which is pretty slick.

Has anyone looked at variations with two or more static arguments though?

For example mappend with two static arguments needs to be defined
either as a new function:


mappendR2 :: Monoid a => (r1 -> r2 -> a) -> (r1 -> r2 -> a) -> r1 -> r2 -> a
mappendR2 f g = \x y -> f x y `mappend` g x y


or an overlapping instance:


instance Monoid a => OPlus (r1 -> r2 -> a) where
   f `mappend` g = \x y -> f x y `mappend` g x y


It's not the prettiest, but you can also just make use of uncurrying and 
have (r1,r2)->a. Via closed cartesian categories, the versions with 
additional arguments are "uninteresting": which is to say, equivalent to 
the uncurried version.


You may want to introduce a shorthand though,

> mappendR2 f g = curry (uncurry f `mappend` uncurry g)
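
The same trick extends to three or more static arguments; a sketch, with
curry3, uncurry3 and mappendR3 introduced here purely for illustration and
relying only on the standard function Monoid instance:

> import Data.Monoid (Monoid, mappend)
>
> uncurry3 :: (a -> b -> c -> d) -> (a, b, c) -> d
> uncurry3 f (a, b, c) = f a b c
>
> curry3 :: ((a, b, c) -> d) -> a -> b -> c -> d
> curry3 f a b c = f (a, b, c)
>
> -- mappend with three static arguments, via the uncurried form
> mappendR3 :: Monoid m => (r1 -> r2 -> r3 -> m) -> (r1 -> r2 -> r3 -> m)
>           -> r1 -> r2 -> r3 -> m
> mappendR3 f g = curry3 (uncurry3 f `mappend` uncurry3 g)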


Working in graphics I've found two static arguments comes up quite
often - preliminarily most of my functions are functions from the
DrawingContext to something (drawing context is an environment that
tracks line width, stroke colour, fill colour, etc.):
[...]
Many of my functions statically use a 'start' point as the only
coordinate reference, so they are in a "coordinate free" style:


This sounds like the (r1,r2)->a approach is even reasonable for 
capturing the actual semantics of your program. Though you may want to 
rename (Point,DrawingCtx) to something like Coordinateful_DrawingCtx.


--
Live well,
~wren


Re: [Haskell-cafe] An interesting paper from Google

2010-10-16 Thread Dan Doel
On Saturday 16 October 2010 7:04:23 pm Ben Millwood wrote:
> On Fri, Oct 15, 2010 at 9:28 PM, Andrew Coppin
> 
>  wrote:
> > I'm still quite
> > surprised that there's no tool anywhere which will trivially print out
> > the reduction sequence for executing an expression. You'd think this
> > would be laughably easy, and yet nobody has done it yet.
> 
> I tried to do something a bit like this:
> 
> http://github.com/benmachine/stepeval
> 
> but it could be charitably described as "crude": has three failing
> testcases and a bagful of unimplemented functionality.

I believe the Buddha debugger could do something like this, as well, although 
normally you wouldn't dump the entire sequence. But it has bit rotted, 
unfortunately (it's quite tied to GHC internals, as far as I can tell).

I never used it, but I've had at least one person tell me it was the best 
debugger they'd ever used. You type in an expression, and continually step 
into different parts of the reduction sequence until you find some core source 
of whatever error you're looking for.

If someone were to revive it, I'm sure many people would be appreciative.

-- Dan


Re: [Haskell-cafe] Re: A rant against the blurb on the Haskell front page

2010-10-16 Thread wren ng thornton

On 10/16/10 11:22 AM, Ben Franksen wrote:

Much better. Though I *do* think mentioning the main implementations and
their qualities is a good thing to do, right after this:

"[...]The most
important Haskell implementation, ghc [link to ghc page], has served as a
test bed for practical application of cutting edge research into the
language as well as its compilation to efficiently executable code."


Objection to calling GHC the most "important". The most mature, most 
fully featured, most common, or even the standard implementation, sure. 
But saying GHC is more important than the rest implies that (among 
others) the work on JHC and UHC is "unimportant". To the contrary, I 
think JHC and UHC are, perhaps, more important than GHC precisely 
because they are treading new waters that the standard implementation 
cannot afford to explore.


--
Live well,
~wren


Re: [Haskell-cafe] Re: A rant against the blurb on the Haskell front page

2010-10-16 Thread wren ng thornton

On 10/16/10 11:34 AM, Ben Franksen wrote:

Christopher Done wrote:

To solve this ambiguity that phrase is a link that people can click to
find out what it means. "Object oriented", "dynamically typed",
"stack-based" are about as meaningful.


The difference may be that everyone thinks he knows what 'object oriented'
means. But 'laziness', 'polymorphic type system', what the heck is that?


Now it's time for my axe grinding (though, tis a wee little axe):

If "polymorphism" is mentioned anywhere in the intro, then it should be 
phrased as "parametric polymorphism" (perhaps with a footnote mention of 
GADTs). Unfortunately the term "polymorphism" has been co-opted by the 
OOP community to mean subtyping and overloading, so there will be many 
people who think they know what it means but will be wrong, because 
those are entirely different beasts than the kind of polymorphism 
Haskell supports. Using the more specific "parametric polymorphism" 
should at least give them pause before misinterpreting it.


--
Live well,
~wren


Re: [Haskell-cafe] Re: A rant against the blurb on the Haskell front page

2010-10-16 Thread wren ng thornton

On 10/16/10 10:48 AM, Ben Franksen wrote:

Don Stewart wrote:

It is open source, and was born open source. It is the product of
research.


How can a language be open source, or rather, how can it *not* be open
source? The point of a (programming) language is that it has a published
('open') definition. Nothing prevents anyone from creating a proprietary
compiler or interpreter for Haskell, AFAIK.


Miranda[TM] is/was a proprietary language, quite definitively so. If 
nothing else, this should be apparent by the fact that every reference 
to it in research papers of the era (a) included the TM sigil, and (b) 
had footnotes indicating who the IP holders are. That was before my 
time, but I was under the impression that Haskell was open from the 
beginning ---by express intention--- in order to enable work on lazy 
functional languages without being encumbered by Miranda[TM]'s closed 
nature.


For that matter, until rather recently Java was very much a closed 
language defined by the runtime system provided by Sun Microsystems and 
not defined by the sequence of characters accepted by that system, nor 
by the behavior of the system when it accepts them. Sun even went 
through some trouble to try to shut out competitive development of 
runtime systems such as SoyLatte, IcedTea, and the like.


Even the venerable C language has a long history of companies making 
proprietary extensions to the language in order to require you to buy 
their compiler, and they would most certainly pursue legal action if 
someone else copied the features. This is why GCC is as big a coup for 
the free/open-source movement as Linux is--- long before GCC changed its 
name and focus to being a compiler collection.


The languages which are open-source are in close correspondence with the 
languages which have a free/open-source implementation. There are a lot 
of them, including the vast majority of recent languages. But don't be 
seduced into thinking that a language is a predicate on acceptable 
strings, a transducer from those strings into computer behaviors, or 
that such predicates and transducers are public domain.


--
Live well,
~wren


Re: [Haskell-cafe] An interesting paper from Google

2010-10-16 Thread Ben Millwood
On Fri, Oct 15, 2010 at 9:28 PM, Andrew Coppin
 wrote:
> I'm still quite
> surprised that there's no tool anywhere which will trivially print out the
> reduction sequence for executing an expression. You'd think this would be
> laughably easy, and yet nobody has done it yet.
>

I tried to do something a bit like this:

http://github.com/benmachine/stepeval

but it could be charitably described as "crude": has three failing
testcases and a bagful of unimplemented functionality.


[Haskell-cafe] UTF-8 problems when decoding JSON data coming from Network.HTTP

2010-10-16 Thread Ionut G. Stan

Hi,

I'm trying to decode this JSON response: 
http://github.com/api/v2/json/user/show/igstan


As you can see, the "name" field contains a non-latin character: ț, and 
it appears that Text.JSON can't decode this response when it comes from 
Network.HTTP. I've tried Network.HTTP.Enumerator too, but the problem 
persists. Here's a simple (hopefully) reproducible test case:


http://gist.github.com/630319

If you load it in ghci and call "main", you'll see that it doesn't 
properly show the user name. Also, calling:


request "http://github.com/api/v2/json/user/show/igstan";

will display the respective character encoded in a way that I cannot
tell is correct or not (Unicode is not one of my strong points at the
moment).


Can anyone shed some light on this problem? Which package is the culprit?
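
One possible workaround, for reference (a sketch only: it assumes the issue
is that Network.HTTP hands back a String with one byte per Char, and it uses
the utf8-string package; fetchDecoded is just an illustrative name):

import Network.HTTP (simpleHTTP, getRequest, getResponseBody)
import qualified Codec.Binary.UTF8.String as UTF8  -- utf8-string

-- fetch a URL and reinterpret the byte-per-Char body as UTF-8 text
fetchDecoded :: String -> IO String
fetchDecoded url = do
    body <- getResponseBody =<< simpleHTTP (getRequest url)
    return (UTF8.decodeString body)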

Thanks,
--
Ionuț G. Stan  |  http://igstan.ro


Re: [Haskell-cafe] Golfing parsec code

2010-10-16 Thread Stephen Tetley
Avoiding /try/ is a good part of Parsec golf. Because turning
natural literals into fractions is easy (%1), it is simple to use the
/option/ parser to parse a suffix or return a default.

/symbol/ is also a valuable parser, often preferable to /char/ or
/string/ as it chomps trailing white space.


import Text.ParserCombinators.Parsec
import qualified Text.ParserCombinators.Parsec.Token as P
import Text.ParserCombinators.Parsec.Language (haskellDef)
import Data.Ratio ((%))

lexer   = P.makeTokenParser haskellDef  -- one possible token parser; the thread presumably has its own
symbol  = P.symbol lexer
integer = P.integer lexer
natural = P.natural lexer

fraction :: Parser Rational
fraction = do
    num <- integer
    den <- option 1 (symbol "/" >> natural)
    return (num % den)
-- e.g. parseTest fraction "3/4"  ==>  3 % 4


[Haskell-cafe] Systematic treatment of static arguments

2010-10-16 Thread Stephen Tetley
Hello list

The Monad and Applicative instances for functions are "equivalent" to
the respective Reader versions (I use "equivalent" along the lines of -
operationally the same but without the type distinction / newtype).
There is also the Monoid instance for functions which is pretty slick.

Has anyone looked at variations with two or more static arguments though?

For example mappend with two static arguments needs to be defined
either as a new function:

> mappendR2 :: Monoid a => (r1 -> r2 -> a) -> (r1 -> r2 -> a) -> r1 -> r2 -> a
> mappendR2 f g = \x y -> f x y `mappend` g x y

or an overlapping instance:

> instance Monoid a => OPlus (r1 -> r2 -> a) where
>   f `mappend` g = \x y -> f x y `mappend` g x y

Working in graphics I've found two static arguments comes up quite
often - preliminarily most of my functions are functions from the
DrawingContext to something (drawing context is an environment that
tracks line width, stroke colour, fill colour, etc.):

fn1 :: DrawingCtx -> a

Many of my functions statically use a 'start' point as the only
coordinate reference, so they are in a "coordinate free" style:

fn2 :: Point -> DrawingCtx -> a

Some functions even have have a third static argument, for example
drawing arrowheads for lines seems nice if the angle of the line is
parametric and not used explicitly:

fn3 :: Radian -> Point -> DrawingCtx -> a

The help from Applicative, Monad and Monoid is "used up" by the one
static argument version, so I find that I have to introduce points (in
the pointed / point-free sense not in the graphic sense) in the fn2
and fn3 versions where a point-free version might still be nice.
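
(For what it is worth, the one-argument instances can be stacked by hand,
which does recover a point-free form at arity two; a sketch, with mappendR2'
just an illustrative name, and no claim that this scales gracefully:)

> import Control.Applicative (liftA2)
> import Data.Monoid (Monoid, mappend)
>
> -- mappend under two static arguments, lifted through two reader layers
> mappendR2' :: Monoid a => (r1 -> r2 -> a) -> (r1 -> r2 -> a) -> r1 -> r2 -> a
> mappendR2' = liftA2 (liftA2 mappend)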

Is there any prior work looking at sets of combinators for these
higher arity functions - papers or code? I'd prefer not to introduce a
whole new lexicon of combinator names if I can help it.

Thanks

Stephen


Re: [Haskell-cafe] Strict Core?

2010-10-16 Thread Andrew Coppin

 On 16/10/2010 09:57 AM, Max Bolingbroke wrote:

I do not have plans to add it. I think it would be worth it - perhaps
worth a few % points in runtime - but I've started researching
supercompilation instead, which has more impressive effects :-)


From what I've seen, strict core makes several things simpler and 
easier to do. So it's not just about how much it speeds up the compiled 
code; it has the potential to make the compiler implementer's job easier.



Simon has said he is keen to use it though - it's just a big
engineering task to replumb GHC to use it, so perhaps this is a
project for an enterprising student.


...easier, that is, if you were writing it from scratch. Of course, any 
non-trivial alteration to a large existing codebase is usually a fair 
bit of work, but *especially* if you're changing really fundamental 
assumptions that pervade the entire thing...




Re: [Haskell-cafe] downloading GHC

2010-10-16 Thread Brandon S Allbery KF8NH

On 10/16/10 12:07 , Ketil Malde wrote:
> Brandon S Allbery KF8NH  writes:
> 
>>> Linux users don't have easy binary installers, usually. What can we do
>>> about this bootstrapping problem?
> 
>> I thought the answer to that was supposed to be "bug your distribution to
>> package the Platform".
> 
> In my case, it's more like bug the IT department to get with the times
> and drop distributions like RHEL and CentOS.  And I do try, but to my
> perpetual chagrin, I'm not always as high on their priority list as I
> might wish...

There is that, isn't there?  And, as one of the aforementioned IT department
folks (admittedly in a different context) it's not always as high on our
priority lists as we might wish, and — worse — we may have our hands tied by
someone even higher in the food chain.

(I'm going to have to drop gtk2hs here because recent versions are
incompatible with the glib (gtk+, oddly, is fine) we have installed on about
a third of the machines in the department, and we're *already* failing to
get those upgraded to something halfway modern)

-- 
brandon s. allbery [linux,solaris,freebsd,perl]  allb...@kf8nh.com
system administrator  [openafs,heimdal,too many hats]  allb...@ece.cmu.edu
electrical and computer engineering, carnegie mellon university  KF8NH


Re: [Haskell-cafe] Strict Core?

2010-10-16 Thread Atze Dijkstra
Hi,

UHC indeed has an experimental typed core, which includes strict core (and is 
also based on GHC's core, and Henk). However, for UHC it is also quite a bit of 
engineering to make it produce typed/strict core for all language features, 
especially since it has to come from untyped core generation. So, there is 
currently a translation for a small set of language features, enough to do some 
experimenting, done by Tom Lokhorst. The current state is maintained and kept 
alive to allow further development in the future, e.g. both core structures 
internally have been put behind a class interface so as to allow sharing of much 
of the generation infrastructure, but this all is still in transit and 
unfinished.

cheers,
 
On  16 Oct, 2010, at 10:57 , Max Bolingbroke wrote:

> Hi Gregory,
> 
> On 15 October 2010 22:27, Gregory Crosswhite  
> wrote:
>> Out of curiosity, are there any plans for GHC to eventually use the Strict
>> Core language described in
>> http://www.cl.cam.ac.uk/~mb566/papers/tacc-hs09.pdf?
> 
> I do not have plans to add it. I think it would be worth it - perhaps
> worth a few % points in runtime - but I've started researching
> supercompilation instead, which has more impressive effects :-)
> 
> Simon has said he is keen to use it though - it's just a big
> engineering task to replumb GHC to use it, so perhaps this is a
> project for an enterprising student.
> 
> That said, I've been told that UHC's core language uses the ideas from
> Strict Core, and they have/had a student at Utrecht (Tom Lokhorst) who
> was working on implementing optimisations like arity raising and deep
> unboxing for the language.
> 
> Cheers,
> Max


- Atze -

Atze Dijkstra, Department of Information and Computing Sciences. /|\
Utrecht University, PO Box 80089, 3508 TB Utrecht, Netherlands. / | \
Tel.: +31-30-2534118/1454 | WWW  : http://www.cs.uu.nl/~atze . /--|  \
Fax : +31-30-2513971  | Email: a...@cs.uu.nl  /   |___\





Re: [Haskell-cafe] A rant against the blurb on the Haskell front page

2010-10-16 Thread Ketil Malde
Don Stewart  writes:

>> Good start, if only the "advanced" were replaced with something more
>> characteristic, like "lazy", or "statically typed". Which, BTW, both do not

> "lazy" and "statically typed" don't mean much to other people. They are
> buzz words that mean nothing to many people.

But they /are/ defining characteristics of the language, still.  I think
they should be mentioned, ideally as links to separate pages (or
pop-ups or a "live" sidebar?) that explain what they mean, and why you'd
want them.

-k
-- 
If I haven't seen further, it is by standing in the footprints of giants


Re: [Haskell-cafe] Bulletproof resource management

2010-10-16 Thread Michael Snoyman
On Fri, Oct 15, 2010 at 8:17 PM, Antoine Latter  wrote:
> On Fri, Oct 15, 2010 at 11:09 AM, Florian Weimer  wrote:
>> * Henning Thielemann:
>>
>>> Some open/close pairs have corresponding 'with' functions, that are
>>> implemented using Exception.bracket. You can also use them within
>>> GHCi. I think using both manual resource deallocation and finalizers
>>> makes everything more complicated and more unreliable.
>>
>> It seems that Exception.bracket does not work in all cases, see the
>> recent "MonadCatchIO, finally and the error monad" thread.
>>
>> Anyway, the ability of closures (and threads) means that something
>> like Exception.bracket does not prevent access to closed handles, so
>> I still need an additional safety net.
>
> That thread is for the function "bracket" provided by the package
> MonadCatchIO. Control.Exception.bracket should work fine as far as I
> know.

I can confirm that when I brought up the issue with bracket, it's only
referring to the function exported by MonadCatchIO. I sure as hell
hope the Control.Exception version always works properly ;).

Michael


Re: [Haskell-cafe] downloading GHC

2010-10-16 Thread Ketil Malde
Brandon S Allbery KF8NH  writes:

>> Linux users don't have easy binary installers, usually. What can we do
>> about this bootstrapping problem?

> I thought the answer to that was supposed to be "bug your distribution to
> package the Platform".

In my case, it's more like bug the IT department to get with the times
and drop distributions like RHEL and CentOS.  And I do try, but to my
perpetual chagrin, I'm not always as high on their priority list as I
might wish...

As a developer, I'd like to work with cutting-edge compilers and tools.
As a user, I want to run on the conservatively configured servers in the
basement.  Glibc makes my binaries non-portable between systems, and
somewhat ironically, the same issue that prevents me from compiling on my
development system and deploying on the server also prevents me from
deploying modern development tools on the server and making a compatible
build.

It seems to me that the easiest solution is to have a different libc to
link against for static builds, but given the huge amount of complaints
about this around the net, and the glaring lack of any solution, I guess
it's not that easy.

-k
-- 
If I haven't seen further, it is by standing in the footprints of giants


Re: [Haskell-cafe] An interesting paper on VM-friendly GC

2010-10-16 Thread Brandon S Allbery KF8NH

On 10/16/10 05:35 , Andrew Coppin wrote:
> GC languages are not exactly rare, so maybe we'll see some OSes start adding
> new system calls to allow the OS to ask the application whether there's any
> memory it can cheaply hand back. We'll see...

I thought Windows already had a system message for something like that.  Or
at least it used to, although I can see why it would have been removed or at
least deprecated.

Unix could do it with a signal, but in general the application can't easily
do that at times chosen by an external entity (consider that the act of
finding such memory could inadvertently *increase* memory pressure on the
system, since an application can't tell which of its pages aren't in core)

The correct solution is to give the application the tools necessary for it
to do its own memory management --- which is what the paper is about.

-- 
brandon s. allbery [linux,solaris,freebsd,perl]  allb...@kf8nh.com
system administrator  [openafs,heimdal,too many hats]  allb...@ece.cmu.edu
electrical and computer engineering, carnegie mellon university  KF8NH


Re: [Haskell-cafe] Re: How to make cabal pass flags to happy?

2010-10-16 Thread Thomas Schilling
You probably want to customise Setup.lhs to use defaultMainWithHooks
and add your own custom suffix handler to the UserHooks, see:

http://hackage.haskell.org/packages/archive/Cabal/1.8.0.6/doc/html/Distribution-Simple.html#t:UserHooks

Take a look at PPSuffixHandler
(http://hackage.haskell.org/packages/archive/Cabal/1.8.0.6/doc/html/Distribution-Simple-PreProcess.html#t:PPSuffixHandler)
and see the source code for ppHappy to get started.

I'm CC'ing Duncan, in case he has a better idea.
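
Roughly along these lines in Setup.hs, with build-type: Custom in the .cabal
file (an untested sketch against the Cabal 1.8 API; the happy flag spelling,
and whether the user-supplied handler takes precedence over the built-in
ppHappy for .ly files, should be double-checked):

import Distribution.Simple
import Distribution.Simple.PreProcess
import Distribution.Simple.Utils (rawSystemExit)
import Distribution.PackageDescription (BuildInfo)
import Distribution.Simple.LocalBuildInfo (LocalBuildInfo)

main :: IO ()
main = defaultMainWithHooks simpleUserHooks
  { hookedPreProcessors = [("ly", glrHappy)] }

-- run happy with --glr instead of the stock preprocessor
glrHappy :: BuildInfo -> LocalBuildInfo -> PreProcessor
glrHappy _bi _lbi = PreProcessor
  { platformIndependent = True
  , runPreProcessor = mkSimplePreProcessor $ \inFile outFile verbosity ->
      rawSystemExit verbosity "happy" ["--glr", inFile, "-o", outFile]
  }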

On 13 October 2010 19:44, Niklas Broberg  wrote:
> On Fri, Oct 8, 2010 at 4:55 PM, Niklas Broberg  
> wrote:
>> Hi all,
>>
>> I want to do something I thought would be quite simple, but try as I
>> might I can't find neither information nor examples on how to achieve
>> it.
>>
>> What I want specifically is to have happy produce a GLR parser from my
>> .ly file, and I want this to happen during 'cabal install'. Which in
>> turn means I want cabal to pass the "--glr" flag to happy during
>> setup. My best guess is that I might want to use 'ppHappy' [1], or
>> something in the vicinity, but there's no documentation for the
>> combinator and it's far from obvious how to pass arguments to it.
>>
>> ... help? :-)
>
> ... anyone? ... please? :-)
>
> Cheers,
>
> /Niklas



-- 
Push the envelope. Watch it bend.


[Haskell-cafe] Re: A rant against the blurb on the Haskell front page

2010-10-16 Thread Ben Franksen
Ben Franksen wrote:
>> > That cutting edge research is done for Haskell as well as for its
>> > implementations is of course good to know, but just stating it is
>> > not nearly enough: such a statement must be corroberated with
>> > evidence, otherwise it is just idle marketing. (Not that there
>> > wouldn't be evidence amass, it's just that none is given.)
>> 
>> You literally want evidence that research played a part in Haskell, in
>> its opening statement? Why??
> 
> I reject this objection. 

Oops, translation error. I wanted to say: I withdraw this objection.

Cheers
Ben



[Haskell-cafe] Re: A rant against the blurb on the Haskell front page

2010-10-16 Thread Ben Franksen
Christopher Done wrote:
> On 16 October 2010 05:52, Ben Franksen  wrote:
>> what marketing idiot has written this inconclusive mumble-jumble of
>> buzz-words?
>> [...]
>> How can anyone write such a
>> nonsense? Haskell is not an "open source product"!
>> [...]
>> I am ashamed that it appears on the front page of my favourite
>> programming language.
>> [...]
>> But no, I forgot, we don't want to explain anything or even be
>> logical, dear reader, we want to pound slogans into your head!
> 
> Stand back everyone, Bill Hicks is back and he's got an axe to grind,
> and it looks rusty!

I am sorry about the wording of these sentences, especially the first one. I
let myself get carried away.

I stand by the critique, though. The blurb mixes up too many things that
should be clearly separated.

> On 16 October 2010 07:49, Donn Cave  wrote:
>> " Haskell is a computer programming language. In particular, it is a
>> polymorphically statically typed, lazy, purely functional language,
>> quite different from most other programming languages. The language
>> is named for Haskell Brooks Curry, whose work in mathematical logic
>> serves as a foundation for functional languages. Haskell is based
>> on the lambda calculus, hence the lambda we use as a logo."
>>
>> This most succinctly expresses the points I tried to convey to him
>> about Haskell, and I don't think it would be out of place on the
>> main page.
> 
> This description is similar to Wikipedia's description of the Joy
> language, with samples from the blurb above spliced in:
> 
> "The Joy programming language is a purely functional programming
> language[, quite different from other programming languages.] It
> was produced by Manfred von Thun of La Trobe University in
> Melbourne, Australia. Joy is based on composition of functions
> rather than lambda calculus[, hence the composition operator
> we use as a logo.]"
> 
> These descriptions are fine, but they don't note how Haskell really is
> any different from other languages, like Joy.

It states very clearly how Haskell is very different from Java, C, C++,
Perl, Python, Ruby, etc.

Distinguishing it from exotic languages like Joy is important but can be
done in a later paragraph.

> It doesn't include the
> fact that Haskell is a very serious language: it has a comprehensive
> and stable implementation, growing community, growing and already
> large library set, is being used seriously in industry, is the focus
> of cutting edge parallelism and concurrency research, has many yearly
> conferences, hackathons, etc. The original blurb does mention these
> things.

All these things should be mentioned, but not before we say what Haskell
actually *is*.

> On 16 October 2010 09:09, Colin Paul Adams 
> wrote:
>> And "purely functional programming language"?
>>
>> If they mean anything to many people, it's that the language works
>> (i.e. functions). What language wouldn't work?
>>
>> I think Ben has a strong point here.
> 
> To solve this ambiguity that phrase is a link that people can click to
> find out what it means. "Object oriented", "dynamically typed",
> "stack-based" are about as meaningful.

The difference may be that everyone thinks he knows what 'object oriented'
means. But 'laziness', 'polymorphic type system', what the heck is that?

I just think there is nothing we can do about that. These concepts are not
as well known as others. We can link to an explanation and we should, but
let's face it, if someone comes to Haskell and expects to see only stuff he
already knows about he will be disappointed, no matter how well we hide
these things on the front page.

Cheers
Ben



[Haskell-cafe] Re: A rant against the blurb on the Haskell front page

2010-10-16 Thread Ben Franksen
Donn Cave wrote:
> Quoth Ben Franksen ,
>> Enough. I think I have made my point.
> 
> Yes, though possibly a little overstated.  While it's easy to share
> your distaste for the blurb, if you take a generous attitude towards it,
> most of it is "true enough."

Sorry. I was not in a generous mood, when I wrote this yesterday night. I am
more so now, and I agree that I have overstated soem points. It was not my
intention to attack the people who wrote the blurb, though it may have
sounded like I did. Again, sorry for that.

I agree with the "most of it is true enough". Certainly, but that is not my
problem with it.

My problem with it is how it is expressed and that it misses out on
important characteristics.

> The implementation specific features are at least widely available to
> anyone who wants to use the language on the most popular computing
> platforms, so it's expedient, if a little cheesy, to say that Haskell
> supports those features.

I hate this sort of expediency. And it doesn't become the front page of a
language that excels in being principled. Haskell did *not* give in to
expedience when it came to IO and side effects. And what would we have now
if it had done so? Something far inferior and less interesting, I am sure.

That said, I have no problem with mentioning the excellent properties of
existing implementations, even in the first (or maybe second) paragraph.
But I want the text to be explicit and honest about this.

> We agree about "strong support for integration with other languages",
> but I wouldn't like to say "strong support for integration with C",
> either.  The FFI is mostly independent of C, per se - outside of the
> hsc macros, it just addresses a sort of platform standard for exposed
> library functionality, which happens to be commonly implemented in C.
> Someone might be able to think of a better way to put that.

Yes, this is difficult to state w/o going into details. Maybe talk
about 'binding to system libraries' or something like that.

> The point I liked best is the one you started with:
> 
>> This blurb should, IMO, give a concise description of what Haskell, the
>> programming language, is, what makes it different from other languages,
>> and why I should be interested in it.
> 
> ... and, we understand, you don't find that in this blurb.  "Lazy" and
> "statically typed" may not be universally understood, but they aren't
> buzz words.  Whether that's the right way to shed some light on what
> Haskell is like, it sure says a lot more on a technical level than
> "advanced purely functional programming language."  And while that
> phrase is linked to a longer exposition of "Functional programming",
> the latter is set in language-independent terms and is at best ambiguous
> about whether it's talking about Haskell or not.

Yes, we have to be careful what we directly link to from the front page, and
especially from the blurb. These documents are almost as important as the
blurb itself.

They should concisely explain the main features for someone unfamiliar with
them, maybe contain some small examples. They definitely should be about
Haskell, not something more general, and also not something more specific,
like ghc, except if explicitly stated.

> I'm trying to picture someone who might find Haskell useful, but would
> be spooked by description of the language in unfamiliar technical
> terms.  Forget Python, this is a little different proposition.  A couple
> days ago I was talking to a friend about Haskell, turned out he hadn't
> heard of it.  I suppose he may have found this blurb.  I hope he
> found the blurb that appears at the top of the Introduction page:
> 
> " Haskell is a computer programming language. In particular, it is a
>   polymorphically statically typed, lazy, purely functional language,
>   quite different from most other programming languages. The language
>   is named for Haskell Brooks Curry, whose work in mathematical logic
>   serves as a foundation for functional languages. Haskell is based
>   on the lambda calculus, hence the lambda we use as a logo."
> 
> This most succinctly expresses the points I tried to convey to him
> about Haskell, and I don't think it would be out of place on the
> main page.

Much better. Though I *do* think mentioning the main implementations and
their qualities is a good thing to do, right after this:

"Haskell can be interpreted or compiled. It has high quality, mature
implementations [link to a list of implementations], including optimizing
compilers, interactive interpreters, profilers and tools for debugging, and
a large and rapidly growing body of libraries [link to hackage]. The most
important Haskell implementation, ghc [link to ghc page], has served as a
test bed for practical application of cutting edge research into the
language as well as its compilation to efficient executable code."

The text should go on and mention concurrency, parallelism, ffi, and
whatnot.

Cheers
Ben

__

[Haskell-cafe] Re: A rant against the blurb on the Haskell front page

2010-10-16 Thread Ben Franksen
Don Stewart wrote:
> ben.franksen:
> > "Haskell is an advanced purely functional programming language."
> > 
> > Good start, if only the "advanced" were replaced with something more
> > characteristic, like "lazy", or "statically typed". Which, BTW, both
> > do not
> 
> "lazy" and "statically typed" don't mean much to other people. They are
> buzz words that mean nothing to many people.

I imagine a list of 'bullet points' following the blurb, listing the main
features with a small explanation and a link to further details.

> > appear in the whole blurb, even though they are *the*
> > characteristics of Haskell, lazyness being even something that sets
> > it apart from most other languages. I hear the marketeers crying
> > "but the average visitor has no idea what lazyness means". So what?
> > Give them a link to the wiki with an explanation. So, a better
> > introductory sentence would be
> > 
> > -> "Haskell is a lazily evaluated, purely functional programming
> > language with a very flexible and powerful static type system."
> 
> What are the benefits of laziness?

It is difficult to explain, in one sentence, lazy evaluation and why it
helps to make programs more concise. But it still is *the* most distinguishing
feature.

And let us be honest: it has benefits and it also has disadvantages. The
most prominent disadvantage being that it makes reasoning about runtime
behaviour, i.e. performance, much more difficult.

> > Next sentence:
> > 
> > "An open source product of more than twenty years of cutting edge
> > research, it allows rapid development of robust, concise, correct
> > software."
> 
> It is open source, and was born open source. It is the product of
> research.

How can a language be open source, or rather, how can it *not* be open
source? The point of a (programming) language is that it has a published
('open') definition. Nothing prevents anyone from creating a proprietary
compiler or interpreter for Haskell, AFAIK.

I do not dispute that Haskell is a result of research and I agree that this
should be stated. It is still not a 'product'. Have you ever heard someone
characterising TCP/IP as a 'product'? Or HTML? Or even C? The
term 'product', as it is used in the blurb, implies or at least suggests
some specific implementation, not a written and published standard.

> > This really gets me every time I read it. How can anyone write such
> > nonsense? Haskell is not an "open source product"! It is no
> > product at all. That most (maybe all) implementations are open
> > source is certainly an interesting fact, but IMO not something that
> > should appear at the top of the page right under the header "The
> > Haskell Programming Language". The second and third sentences
> > deliberately conflate language and implementation(s). This is a well
> > known fallacy and I am ashamed that it
> 
> As Python, Ruby, C and every other language do.

Exactly my point. They all have only one relevant implementation. And in
practice this implementation has a great tendency to be taken as the
*definition* of the language, at least w.r.t. its semantics.

Do I really have to explain why this is bad and why specification of a
language should be clearly separate from its implementation? (Hint: how can
you ever hope to prove something about a piece of code if all the semantics
you have is "whatever the implementation does"?)

I *love* it that there *do* exist other implementations for Haskell, and I
dearly hope UHC, JHC etc will soon get to the point where they can be used
and recommended for practical programming.

> > That cutting edge research is done for Haskell as well as for its
> > implementations is of course good to know, but just stating it is
> > not nearly enough: such a statement must be corroborated with
> > evidence, otherwise it is just idle marketing. (Not that there
> > wouldn't be evidence en masse, it's just that none is given.)
> 
> You literally want evidence that research played a part in Haskell, in
> its opening statement? Why??

I retract this objection. I let myself get carried away here. Still I think
that further down some links can be given to papers about Haskell and its
implementation.

> > On we go:
> > 
> > "With strong support for integration with other languages, built-in
> > concurrency and parallelism, debuggers, profilers, rich libraries
> > and an active community, Haskell makes it easier to produce
> > flexible, maintainable high-quality software."
> > 
> > Let us take that apart:
> > 
> > (1) Fact: Haskell has a good and very easy to use FFI. To the C
> > language. I have never heard of integration with any other language
> > being directly supported.
> 
> It is OK to contest these, but consider the FFI of our competition:
> Python, Ruby, Erlang. Woeful FFIs. You are not at risk using Haskell, as
>

Re: [Haskell-cafe] An interesting paper on VM-friendly GC

2010-10-16 Thread Thomas Schilling
On 16 October 2010 10:35, Andrew Coppin  wrote:
>  On 15/10/2010 11:50 PM, Gregory Crosswhite wrote:
>>
>>  On 10/15/2010 03:15 PM, Andrew Coppin wrote:
>>>
>>> On the other hand, their implementation uses a modified Linux kernel, and
>>> no sane person is going to recompile their OS kernel with a custom patch
>>> just to run Haskell applications, so we can't do quite as well as they did.
>>> But still, an interesting read...
>>>
>> Ah, but you are missing an important fact about the article:  it is not
>> about improving garbage collection for Haskell, it is about improving
>> collection for *Java*, which is a language in heavy use on servers.  If this
>> performance gain really is such a big win, then I bet that it would highly
>> motivate people to make this extension as part of the standard Linux kernel,
>> at which point we could use it in the Haskell garbage collector.
>
> Mmm, that's interesting. The paper talks about "Jikes", but I have no idea
> what that is. So it's a Java implementation then?

Jikes is a virtual machine used for research; it actually has a decent
just-in-time compiler.  Its memory management toolkit (MMTk) also
makes it quite easy to experiment with new GC designs.

> Also, it's news to me that Java finds heavy use anywhere yet. (Then again,
> if they run Java server-side, how would you tell?)

Oh, it's *very* heavily used.  Many commercial products run on Java
both server and client.

> It seems to me that most operating systems are designed with the assumption
> that all the code being executed will be C or C++ with manual memory
> management. Ergo, however much memory the process has requested, it actually
> *needs* all of it. With GC, this assumption is violated. If you ask the GC
> nicely, it may well be able to release some memory back to you. It's just
> that the OS isn't designed to do this, so the GC has no idea whether it's
> starving the system of memory, or whether there's plenty spare.
>
> I know the GC engine in the GHC RTS just *never* releases memory back to the
> OS. (I imagine that's a common choice.) It means that if the amount of truly
> live data fluctuates up and down, you don't spend forever allocating and
> freeing memory from the OS. I think we could probably do better here.
> (There's an [ancient] feature request ticket for it somewhere on the
> Trac...) At a minimum, I'm not even sure how much notice the current GC
> takes of memory page boundaries and cache effects...

Actually that's been fixed in GHC 7.

> GC languages are not exactly rare, so maybe we'll see some OSes start adding
> new system calls to allow the OS to ask the application whether there's any
> memory it can cheaply hand back. We'll see...

I wouldn't be surprised if some OS kernels already have some
undocumented features to aid VM-friendly GC.  I think it's probably
going to have to be the other way around, though: not that the OS should
ask for memory back, but that the application should ask for the page
access bits and then decide for itself (as done in the paper).  I don't
know how that interacts with the VM paging strategy, though.
Microkernels such as L4 already support these things (e.g., L4 using
the UNMAP system call).  Xen and co. probably have something similar.
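
At the application level, about the only knob available today is to trigger
a major collection explicitly via System.Mem.performGC.  A minimal sketch of
that (purely illustrative; by itself it does not hand pages back to the OS):

import Control.Concurrent (threadDelay)
import Control.Monad (forever)
import System.Mem (performGC)

-- Hypothetical helper: periodically force a major GC so the heap is
-- compacted before the OS has to start paging.
gcHeartbeat :: Int -> IO ()
gcHeartbeat seconds = forever $ do
  threadDelay (seconds * 1000000)
  performGC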


-- 
Push the envelope. Watch it bend.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Eta-expansion and existentials (or: types destroy my laziness)

2010-10-16 Thread Max Bolingbroke
On 16 October 2010 12:16, Roman Leshchinskiy  wrote:
> eta :: Stream a -> Stream a
> eta s = Stream s next
>   where
>     next (Stream s next') = case next' s of
>                               Just (x,s') -> Just (x,Stream s' next')
>                               Nothing     -> Nothing
>
> Making GHC optimise stream code involving eta properly is hard :-)

Good point, I don't exactly mean non-recursive for requirement 3) then
- I mean an adjective with a fuzzier definition like "GHC-optimisable"
:-)

Max
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Eta-expansion and existentials (or: types destroy my laziness)

2010-10-16 Thread Roman Leshchinskiy

On 16/10/2010, at 12:00, Max Bolingbroke wrote:

> Hi Cafe,
> 
> I've run across a problem with my use of existential data types,
> whereby programs using them are forced to become too strict, and I'm
> looking for possible solutions to the problem.
> 
> I'm going to explain what I mean by using a literate Haskell program.
> First, the preliminaries:
> 
>> {-# LANGUAGE ExistentialQuantification #-}
>> import Control.Arrow (second)
>> import Unsafe.Coerce
> 
> Let's start with a simple example of an existential data type:
> 
>> data Stream a = forall s. Stream s (s -> Maybe (a, s))
> 
> [...]
> In fact, to define a correct cons it would be sufficient to have some
> function (eta :: Stream a -> Stream a) such that (eta s) has the same
> semantics as s, except that eta s /= _|_ for any s.

That's easy.

eta :: Stream a -> Stream a
eta s = Stream s next
  where
    next (Stream s next') = case next' s of
                              Just (x, s') -> Just (x, Stream s' next')
                              Nothing      -> Nothing
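
With this eta, value recursion along the lines of Max's 'ones' example
becomes productive, because 'eta s' is a constructor application and hence
never _|_ even when s is.  A quick, untested sketch (reusing the Stream and
cons from the parent message):

ones :: Stream Int
ones = cons 1 (eta ones)   -- yields 1, 1, 1, ... instead of _|_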

Making GHC optimise stream code involving eta properly is hard :-)

Roman


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Eta-expansion and existentials (or: types destroy my laziness)

2010-10-16 Thread Max Bolingbroke
Hi Cafe,

I've run across a problem with my use of existential data types,
whereby programs using them are forced to become too strict, and I'm
looking for possible solutions to the problem.

I'm going to explain what I mean by using a literate Haskell program.
First, the preliminaries:

> {-# LANGUAGE ExistentialQuantification #-}
> import Control.Arrow (second)
> import Unsafe.Coerce

Let's start with a simple example of an existential data type:

> data Stream a = forall s. Stream s (s -> Maybe (a, s))

This is a simplified version of the type of streams from the stream
fusion paper by Coutts et al. The idea is that if you have a stream,
you have some initial state s,
which you can feed to a step function. The step function either says
"Stop! The stream has ended!", or yields an element of the stream and
an updated state.

The use of an existential quantifier is essential to this code,
because it means that we can write most functions on Streams in a
non-recursive fashion. This in turn makes them amenable to inlining
and simplification by GHC, which gives the us the loop fusion we need.

An example stream is that of natural numbers:

> nats :: Stream Int
> nats = Stream 0 (\n -> Just (n, n + 1))

Here, the state is just the next natural number to output.
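
For playing with these definitions in GHCi it can be handy to unfold a
stream back into an ordinary list.  A small, hypothetical helper along these
lines should do (note it is recursive, so it is for inspection only and
plays no part in the fusion story below):

> -- Inspection only: unfold a stream back into a list (not fusable).
> toList :: Stream a -> [a]
> toList (Stream s step) = go s
>   where
>     go s' = case step s' of
>       Nothing       -> []
>       Just (a, s'') -> a : go s''

For example, take 3 (toList nats) should give [0,1,2].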

We can also build the list constructors as functions on streams:

> nil :: Stream a
> nil = Stream () (const Nothing)

> cons :: a -> Stream a -> Stream a
> cons a (Stream s step) =
>   Stream Nothing (maybe (Just (a, Just s)) (fmap (second Just) . step))

List functions can also easily be expressed:

> taken :: Int -> Stream a -> Stream a
> taken n (Stream s step) =
>   Stream (n, s) (\(n, s) -> if n <= 0
>                               then Nothing
>                               else maybe Nothing (\(a, s) -> Just (a, (n - 1, s))) (step s))

To see this all in action, we will need a Show instance for streams.
Note how this code implements a loop where it repeatedly steps the
stream (starting from
the initial state):

> instance Show a => Show (Stream a) where
>   showsPrec _ (Stream s step) k = '[' : go s
>     where
>       go s = maybe (']' : k)
>                    (\(a, s) -> shows a . showString ", " $ go s)
>                    (step s)

We can now run code like this:

> test1 = do
>   print $ (nil :: Stream Int)            -- []
>   print $ cons 1 nil                     -- [1, ]
>   print $ taken 1 $ cons 1 $ cons 2 nil  -- [1, ]

Now we may wish to define infinite streams using value recursion:

> ones :: Stream Int
> ones = cons 1 ones

Unfortunately, 'ones' is just _|_! The reason is that cons is strict
in its second argument. The problem I have is that there is no way to
define cons which is
simultaneously:

  1. Lazy in the tail of the list
  2. Type safe
  3. Non-recursive

If you relax any of these constraints it becomes possible. For
example, if we don't care about using only non-recursive functions we
can get the cons we want
by taking a roundtrip through the skolemized (existential-eliminated -
see http://okmij.org/ftp/Computation/Existentials.html) version of
streams:

> data StreamSK a = StreamSK (Maybe (a, StreamSK a))
>
> skolemize :: Stream a -> StreamSK a
> skolemize (Stream s step) =
>   StreamSK (fmap (\(a, s') -> (a, skolemize (Stream s' step))) $ step s)
>
> unskolemize :: StreamSK a -> Stream a
> unskolemize streamsk = Stream streamsk (\(StreamSK next) -> next)
>
> instance Show a => Show (StreamSK a) where
>   showsPrec _ (StreamSK mb_next) k =
>     maybe (']' : k) (\(a, ssk) -> shows a . showString ", " $ shows ssk k) mb_next

Now we can define:

> cons_internally_recursive x stream = cons x (unskolemize (skolemize stream))

This works because unskolemize (skolemize stream) != _|_ even if
stream is bottom. However, this is a non-starter because GHC isn't
able to fuse together the (recursive) skolemize function with any
consumer of it (e.g. unskolemize).

In fact, to define a correct cons it would be sufficient to have some
function (eta :: Stream a -> Stream a) such that (eta s) has the same
semantics as s, except
that eta s /= _|_ for any s. I call this function eta because it
corresponds to classical eta expansion. We can define a type class for
such operations with a number
of interesting instances:

> class Eta a where
>   -- eta a /= _|_
>   eta :: a -> a
>
> instance Eta (a, b) where
>   eta ~(a, b) = (a, b)
>
> instance Eta (a -> b) where
>   eta f = \x -> f x
>
> instance Eta (StreamSK a) where
>   eta ~(StreamSK a) = StreamSK a
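
The point of the lazy (~) patterns is that eta always hands back something
already in WHNF.  A tiny, hypothetical check makes this concrete:

> -- Matching on eta undefined succeeds, because eta rebuilds the constructor.
> etaNotBottom :: String
> etaNotBottom = case eta (undefined :: (Int, Bool)) of
>   (_, _) -> "eta undefined is not _|_"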

If we had an instance for Eta (Stream a) we could define a lazy cons function:

> cons_lazy :: a -> Stream a -> Stream a
> cons_lazy x stream = cons x (eta stream)

As we have already seen, one candidate instance is that where eta =
unskolemize . skolemize, but I've already ruled that possibility out.
Given that constraint, the
only option that I see is this:

> instance Eta (Stream a) where
>   -- Doesn't type check, even though it "can't go wrong":
>   -- eta stream = Stream (case stream of Stream s _ -> s)
>   --                     (case stream of Stream _ step -> step)
>   eta stream = Stre

Re: [Haskell-cafe] A rant against the blurb on the Haskell front page

2010-10-16 Thread Andrew Coppin

 On 16/10/2010 09:02 AM, Stephen Tetley wrote:

On 16 October 2010 08:09, Colin Paul Adams  wrote:


And "purely functional programming language"?

If they mean anything to many people, it's that the language works
(i.e. functions). What language wouldn't work?

I think Ben has a strong point here.

If a "functional language" doesn't mean anything significant then
Haskell probably isn't the language you should be choosing.


By that rationale, I should never have chosen Haskell. (I'm really glad 
I did though...)



In the UK some time before Haskell, I believe there was some effort to
re-brand "functional programming" to "applicative programming" to make
a distinction between functional - "actually works!" - and (first-order)
functions in C or Pascal that were like procedures but returned a
result. This was before my time, but I'm sure I saw evidence in
reports at my old university library for grant proposals / research
awards to put applicative programming on parallel machines.


I've always thought "function-oriented programming" (by analogy to 
"object-oriented programming") to be a far more illunimating term. But 
of course, as son as you do that, anybody who knows about "functional 
programming" will wonder if "function-oriented programming" is a 
different animal somehow... It seems that for good or ill, we're stuck 
with the existing terminology.


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] An interesting paper on VM-friendly GC

2010-10-16 Thread Andrew Coppin

 On 15/10/2010 11:50 PM, Gregory Crosswhite wrote:

 On 10/15/2010 03:15 PM, Andrew Coppin wrote:
On the other hand, their implementation uses a modified Linux kernel, 
and no sane person is going to recompile their OS kernel with a 
custom patch just to run Haskell applications, so we can't do quite 
as well as they did. But still, an interesting read...


Ah, but you are missing an important fact about the article:  it is 
not about improving garbage collection for Haskell, it is about 
improving collection for *Java*, which is a language in heavy use on 
servers.  If this performance gain really is such a big win, then I 
bet that it would highly motivate people to make this extension as 
part of the standard Linux kernel, at which point we could use it in 
the Haskell garbage collector.


Mmm, that's interesting. The paper talks about "Jikes", but I have no 
idea what that is. So it's a Java implementation then?


Also, it's news to me that Java finds heavy use anywhere yet. (Then 
again, if they run Java server-side, how would you tell?)


It seems to me that most operating systems are designed with the 
assumption that all the code being executed will be C or C++ with manual 
memory management. Ergo, however much memory the process has requested, 
it actually *needs* all of it. With GC, this assumption is violated. If 
you ask the GC nicely, it may well be able to release some memory back 
to you. It's just that the OS isn't designed to do this, so the GC has 
no idea whether it's starving the system of memory, or whether there's 
plenty spare.


I know the GC engine in the GHC RTS just *never* releases memory back to 
the OS. (I imagine that's a common choice.) It means that if the amount 
of truly live data fluctuates up and down, you don't spend forever 
allocating and freeing memory from the OS. I think we could probably do 
better here. (There's an [ancient] feature request ticket for it 
somewhere on the Trac...) At a minimum, I'm not even sure how much 
notice the current GC takes of memory page boundaries and cache effects...


GC languages are not exactly rare, so maybe we'll see some OSes start 
adding new system calls to allow the OS to ask the application whether 
there's any memory it can cheaply hand back. We'll see...


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] ANNOUNCE: DocTest-0.1.0

2010-10-16 Thread Simon Hengel
I'm very excited to announce a new version of DocTest[1]. DocTest now
uses Haddock for parsing of comments. Examples are now marked-up with
Haddock's newly introduced support for "examples of interaction"[2][3].

A very basic example of usage is at [4].
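
Roughly, such an example is an ordinary Haddock comment containing a small
REPL session; DocTest evaluates the expression and compares the output.  A
minimal, hypothetical illustration (see [2] and [3] for the exact markup):

-- | Double a number.
--
-- >>> double 21
-- 42
double :: Int -> Int
double = (* 2)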

DocTest is still experimental. Suggestions and patches are gladly
welcome!

Cheers,
Simon

[1] http://hackage.haskell.org/cgi-bin/hackage-scripts/package/DocTest
[2] http://www.haskell.org/haddock/CHANGES.txt
[3] http://www.haskell.org/haddock/doc/html/ch03s08.html#id566093
[4] http://haskell.org/haskellwiki/DocTest
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Strict Core?

2010-10-16 Thread Max Bolingbroke
Hi Gregory,

On 15 October 2010 22:27, Gregory Crosswhite  wrote:
> Out of curiosity, are there any plans for GHC to eventually use the Strict
> Core language described in
> http://www.cl.cam.ac.uk/~mb566/papers/tacc-hs09.pdf?

I do not have plans to add it. I think it would be worth it - perhaps
worth a few % points in runtime - but I've started researching
supercompilation instead, which has more impressive effects :-)

Simon has said he is keen to use it though - it's just a big
engineering task to replumb GHC to use it, so perhaps this is a
project for an enterprising student.

That said, I've been told that UHC's core language uses the ideas from
Strict Core, and they have/had a student at Utrecht (Tom Lokhorst) who
was working on implementing optimisations like arity raising and deep
unboxing for the language.

Cheers,
Max
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] A rant against the blurb on the Haskell front page

2010-10-16 Thread Christopher Done
On 16 October 2010 05:52, Ben Franksen  wrote:
> what marketing idiot has written this inconclusive mumble-jumble of 
> buzz-words?
> [...]
> How can anyone write such
> nonsense? Haskell is not an "open source product"!
> [...]
> I am ashamed that it appears on the front page of my favourite
> programming language.
> [...]
> But no, I forgot, we don't want to explain anything or even be
> logical, dear reader, we want to pound slogans into your head!

Stand back everyone, Bill Hicks is back and he's got an axe to grind,
and it looks rusty!

On 16 October 2010 07:49, Donn Cave  wrote:
> " Haskell is a computer programming language. In particular, it is a
>  polymorphically statically typed, lazy, purely functional language,
>  quite different from most other programming languages. The language
>  is named for Haskell Brooks Curry, whose work in mathematical logic
>  serves as a foundation for functional languages. Haskell is based
>  on the lambda calculus, hence the lambda we use as a logo."
>
> This most succinctly expresses the points I tried to convey to him
> about Haskell, and I don't think it would be out of place on the
> main page.

This description is similar to Wikipedia's description of the Joy
language, with samples from the blurb above spliced in:

"The Joy programming language is a purely functional programming
language[, quite different from other programming languages.] It
was produced by Manfred von Thun of La Trobe University in
Melbourne, Australia. Joy is based on composition of functions
rather than lambda calculus[, hence the composition operator
we use as a logo.]"

These descriptions are fine, but they don't note how Haskell really is
any different from other languages, like Joy. It doesn't include the
fact that Haskell is a very serious language: it has a comprehensive
and stable implementation, growing community, growing and already
large library set, is being used seriously in industry, is the focus
of cutting edge parallelism and concurrency research, has many yearly
conferences, hackathons, etc. The original blurb does mention these
things.

On 16 October 2010 09:09, Colin Paul Adams  wrote:
> And "purely functional programming language"?
>
> If they mean anything to many people, it's that the language works
> (i.e. functions). What language wouldn't work?
>
> I think Ben has a strong point here.

To resolve this ambiguity, that phrase is a link that people can click to
find out what it means. "Object oriented", "dynamically typed",
"stack-based" are about as meaningful.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Bug in HTTP (bad internal error handling)

2010-10-16 Thread Claus Reinke

After it catches this error, the function returns (line 376):

return (fail (show e))

The "fail" is running in the Either monad (The Result type = Either).
This calls the default Monad implementation of fail, which is just a
call to plain old error. This basically causes the entire program to
crash.



Actually, it appears that simpleHTTP isn't supposed to throw
an IOException, and it is instead supposed to return a ConnError
result. So the real fix is to fix the code to make this happen. But


Sounds like a victim of 


   http://hackage.haskell.org/trac/ghc/ticket/4159

For mtl clients, 'fail' for 'Either' used to call 'Left'. That was
changed, though the ticket does not indicate the library
versions affected.

Claus

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] A rant against the blurb on the Haskell front page

2010-10-16 Thread Stephen Tetley
On 16 October 2010 09:02, Stephen Tetley  wrote:

> On the main topic - I think the blurb is fine. If Python and Ruby want
> to do proselytization and value judgements please leave them to it.

PS - Were it me, I would drop the third sentence of the Haskell.org
blurb; to me it is a value judgement - "easier" than what?
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Golfing parsec code

2010-10-16 Thread Jacek Generowicz
I'm trying to write a utility which is able to read Ratios from two  
distinct formats


   "2 / 3"  -> 2 % 3

   "4"  ->  4 % 1

I'm sure that a skilled Haskell programmer could vastly improve on my  
code (do I even need Parsec?), so I've come to solicit advice.



import Text.ParserCombinators.Parsec
import qualified Text.ParserCombinators.Parsec.Token as P
import Text.Parsec.Language (haskellDef)

import Data.Ratio


readFraction :: String -> Either ParseError (Ratio Integer)
readFraction = parse ratio "-"


lexer = P.makeTokenParser haskellDef

integer = P.integer lexer
natural = P.natural lexer
whiteSpace = P.whiteSpace lexer

ratio = try frac <|> integer'

integer' = do
  i <- integer
  return $ fromInteger i

frac = do
  whiteSpace
  num <- natural
  char '/'
  den <- integer
  return (num % den)

f = readFraction "  3  /  2  "
i = readFraction " 6  "

main = print [f,i]
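
For comparison, one Parsec-free alternative I can imagine (untested, and it
reuses the Data.Ratio import above) would be something like the following,
though I am not sure it is any clearer:

readFraction' :: String -> Maybe (Ratio Integer)
readFraction' s =
  case break (== '/') s of
    (n, '/' : d) -> do
      num <- readInteger n
      den <- readInteger d
      return (num % den)
    (n, "") -> do
      num <- readInteger n
      return (num % 1)
    _ -> Nothing
  where
    -- Accept optional spaces around a plain integer literal.
    readInteger t = case reads (dropWhile (== ' ') t) of
      [(i, rest)] | all (== ' ') rest -> Just (i :: Integer)
      _ -> Nothing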

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] A rant against the blurb on the Haskell front page

2010-10-16 Thread Stephen Tetley
On 16 October 2010 08:09, Colin Paul Adams  wrote:

> And "purely functional programming language"?
>
> If they mean anything to many people, it's that the language works
> (i.e. functions). What language wouldn't work?
>
> I think Ben has a strong point here.

If a "functional language" doesn't mean anything significant then
Haskell probably isn't the language you should be choosing.

In the UK some time before Haskell, I believe there was some effort to
re-brand "functional programming" to "applicative programming" to make
a distinction between functional - "actually works!" - and (first-order)
functions in C or Pascal that were like procedures but returned a
result. This was before my time, but I'm sure I saw evidence in
reports at my old university library for grant proposals / research
awards to put applicative programming on parallel machines.

Caveat - my university didn't do any research on this but it did have
government reports of computer matters stored amongst the programming
books, one obviously with a title "functional" or "applicative" enough
to catch my interest.

On the main topic - I think the blurb is fine. If Python and Ruby want
to do proselytization and value judgements please leave them to it.

Best wishes

Stephen
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Bug in HTTP (bad internal error handling)

2010-10-16 Thread Bit Connor
Hello, I have reported this problem to the maintainer of the HTTP
package about 2 weeks ago, but have not yet received a response, so I
am reporting it here.

I am using a recent git check out of HTTP 4000.0.10

The bufferReadLine function from Network.TCP has a bug in how it
handles an IOException after it catches one. It checks for isEOFError
e, and that branch of code seems to be ok. The problem is how it
handles other types of IO errors. One type of such an error, for
example, is:

: Data.ByteString.hGetLine: resource vanished (Connection
reset by peer)

After it catches this error, the function returns (line 376):

return (fail (show e))

The "fail" is running in the Either monad (The Result type = Either).
This calls the default Monad implementation of fail, which is just a
call to plain old error. This basically causes the entire program to
crash.

A simple fix is to instead just reraise the original error:

ioError e

This will propagate the error through as a normal IOException, which
can then be caught by the caller (such as the caller of simpleHTTP).
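
To make the shape of the suggested change concrete, here is a small
self-contained sketch of a read-line wrapper with the same structure (the
names and the EOF branch are illustrative, not the actual Network.TCP code):

import qualified Control.Exception as E
import qualified Data.ByteString.Char8 as BS
import System.IO (Handle)
import System.IO.Error (isEOFError)

readLineOrRethrow :: Handle -> IO BS.ByteString
readLineOrRethrow h =
  BS.hGetLine h `E.catch` \e ->
    if isEOFError e
      then return BS.empty   -- stand-in for the library's EOF handling
      else ioError e         -- rethrow, instead of  return (fail (show e))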

By looking at the library code, the same bug seems to be in other
places (bufferGetBlock, bufferPutBlock), but I have only observed the
bug actually occurring in bufferReadLine.

Actually, it appears that simpleHTTP isn't supposed to throw
an IOException, and it is instead supposed to return a ConnError
result. So the real fix is to fix the code to make this happen. But
I've found simpleHTTP to throw an IOException in a lot of
circumstances, so for now I think the fix above is a good immediate
solution.

Thanks,
Bit Connor
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] A rant against the blurb on the Haskell front page

2010-10-16 Thread Colin Paul Adams
> "Don" == Don Stewart  writes:

>> Let me explain.
>> 
>> "Haskell is an advanced purely functional programming language."
>> 
>> Good start, if only the "advanced" were replaced with something
>> more characteristic, like "lazy", or "statically typed". Which,
>> BTW, both do not

Don> "lazy" and "statically typed" don't mean much to other
Don> people. They are buzz words that mean nothing to many people.

And "purely functional programming language"?

If they mean anything to many people, it's that the language works
(i.e. functions). What language wouldn't work? 

I think Ben has a strong point here.
-- 
Colin Adams
Preston Lancashire
()  ascii ribbon campaign - against html e-mail
/\  www.asciiribbon.org   - against proprietary attachments
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe