Re: Standard Haskell
On 9/8/98 5:10 PM, Andrew Rock wrote:

> If Standard Haskell is meant to be a stable target for texts and the like, why not Haskell-Ed (for Education), perhaps with a version indication like Haskell-Ed-98.

Unfortunately, this carries the risk that the uninformed may think that the language was named after Eddie "That's a very nice sweater, Mrs. Cleaver" Haskell.

- Jim Hassett
Re: Standard Haskell
I think I favor "20th century Haskell" myself :-)

Hassett wrote:
> On 9/8/98 5:10 PM, Andrew Rock wrote:
> > If Standard Haskell is meant to be a stable target for texts and the like, why not Haskell-Ed (for Education), perhaps with a version indication like Haskell-Ed-98.
> Unfortunately, this carries the risk that the uninformed may think that the language was named after Eddie "That's a very nice sweater, Mrs. Cleaver" Haskell.
> - Jim Hassett
Re: Standard Haskell
People seem to be forgetting the long-standing tradition of Algol (60), Fortran (66, 77, 90) ... not to mention Algol W, S-algol, PS-algol and H Level FORTRAN ... If Simon worked for IBM he could call it FP/I, in the tradition of PL/I. So why not Haskell-1, to be followed by Haskell-2, or even Haskell-A ...

I'm almost missing incomprehensible discourse on the Haskell type system ...

Greg Michaelson
Re: Standard Haskell
Why not Haskell I? (as the first "standard" form of the language)...

--Artie
Re: Standard Haskell
People seem to be forgetting the long-standing tradition of Algol (60), Fortran (66, 77, 90) and, no doubt, many other fine languages in their use of 2-digit year qualifiers. 98/99 sounds good to me.

On Mon, 7 Sep 1998, Simon Peyton-Jones wrote:
> Incidentally, I'm leaning towards 'Haskell 98' as the name.

Was it Bill Gates that suggested this to you? :-)

At 12:00 +0100 98/09/08, Stephen H. Price wrote:
> a) Haskell 1998 would be more appropriate in the light of Year 2000 problems.

Probably in order to avoid Haskell 2000 being confused with Haskell 1900. :-) Otherwise, since it is rather late this year, it should be called Haskell 1999. Those that use it will feel the next year that they have an up-to-date version.

Hans Aberg
* Email: Hans Aberg mailto:[EMAIL PROTECTED]
* Home Page: http://www.matematik.su.se/~haberg/
* AMS member listing: http://www.ams.org/cml/
Re: Standard Haskell: More lexical/syntactic issues (from Alastair Reid)
On the Standard Haskell site, Alastair Reid wrote:

> One of the goals of Standard Haskell was to simplify the language - removing traps and making it easier to teach/learn. We've seen very little work on this, so I'd like to make the following proposal: let's remove all the little syntactic shortcuts which only save a few characters but require you to understand something new or to learn what the default is.
>
> 1) Fixity declarations usually look like this:
>
>     infixl 6 +, -
>
> but you can omit the precedence digit and write this instead:
>
>     infixl +, -
>
> The programmer who uses this avoids having to type 2 characters. The programmer who's reading this code has to learn that the precedence digit can be omitted (I didn't believe it when I first saw it) and then look up the default precedence in the report. I think it is harder to understand programs that use this shortcut and that it should be removed.

I don't think it's harder. Even if the number is specified explicitly, I would *still* have to look up the precedence table in the report, or at least grep the source for the standard prelude, because I don't know what the precedence of the other operators is.

> 2) Deriving lists come in two flavours:
>
>     data ... deriving Eq
>     data ... deriving (Eq, Ord)
>
> ... I think we should eliminate the first style.

I could live with that.

> 3) Empty contexts are not allowed. It's fine to write:
>
>     f :: (Ord a) => a -> a -> Bool
>
> but you can't write:
>
>     f :: () => a -> a -> Bool
>
> I think we should relax the syntax to allow empty contexts.

Who would ever write them? Even for programs that generate Haskell code, it's trivial to handle the empty context case differently.

> 3) Contexts come in two flavours:
>
>     f :: Ord a => a -> a -> Bool
>
> and
>
>     f :: (Ord a, Bounded a) => a -> a -> Bool
>
> Again, we save 2 characters but then we have to explain both forms of syntax. And, again, it's inconsistent to let you omit the parens in contexts but force you to put them in in import-export lists.
> Change: delete one line from the grammar and half a sentence from 4.1.2.

I could live with that, but it might break a lot of existing code.

> 4) Module headers can be omitted. If the module leaves out the module header, the header
>
>     module Main(main) where
>
> is assumed. This saves 23 characters of typing (17 if you omit the export list, 14 if you call the module A instead). When teaching Haskell, you still have to explain about modules because your Haskell compiler will refer to the module "Main" in error messages.

Fix the compilers. If there's no module header, the compiler should not include the module name (Main) in the error messages.

> 5) Lists of constructors can be empty in export lists but can't be empty in import lists. ... I think this is a simple typo. Change: replace n>=1 with n>=0 in the grammar rule for import.

This one I support.

-- 
Fergus Henderson [EMAIL PROTECTED] | "I have always known that the pursuit
WWW: http://www.cs.mu.oz.au/~fjh   | of excellence is a lethal habit"
PGP: finger [EMAIL PROTECTED]      | -- the last words of T. S. Garp.
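For readers joining the thread, the shortcuts under discussion can be seen side by side in one small module (the names are invented for illustration; all of this is legal Haskell):

```haskell
-- A sketch of the forms Alastair proposes to remove or relax.
module Main (main) where      -- the header that is assumed when omitted

data Flag = On | Off
  deriving Eq                 -- single-class shortcut (item 2)
data Colour = Red | Green | Blue
  deriving (Eq, Ord, Show)    -- parenthesised deriving list

infixl 6 <+>                  -- explicit precedence digit (item 1);
                              -- "infixl <+>" is also legal: the digit
                              -- then defaults to 9 per the report
(<+>) :: Int -> Int -> Int
x <+> y = x + y

same :: (Ord a) => a -> a -> Bool   -- parenthesised one-class context;
same x y = not (x < y) && not (y < x)
-- "same :: Ord a => ..." is equally legal, but "() => ..." is not

main :: IO ()
main = print (1 <+> 2, same Green Blue)
```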
Re: Standard Haskell: More lexical/syntactic issues (from Alastair Reid)
> If you want a functional scripting language with H-M type inference and type classes and monads, that's great, but maybe it should be something separate from Haskell.

I have been promoting Haskell exactly for this purpose for some time now, and I don't buy your points, e.g. that in a scripting language it is desirable to have lots of defaults, etc. so that there is as little time as possible between the time you start to write a program and the time you get it running. Scripting languages should optimize the programmer's time by making the language small, simple, readable, type-safe, ... I think that all of Alastair's points help achieve exactly that.
Re: Standard Haskell: More lexical/syntactic issues (from Alastair Reid)
On 24-Jun-1998, Frank A. Christoph [EMAIL PROTECTED] wrote:
> > > 4) Module headers can be omitted. If the module leaves out the module header, the header module Main(main) where is assumed. [and that's a mistake]
> > Fix the compilers. If there's no module header, the compiler should not include the module name (Main) in the error messages.
> What do you propose they should use in its stead? "Type error in the module formerly known as Main"? ;)

Just "Type error". No need to point out which module it is in if there's only one module.

It may be reasonable to require module headers, but I don't think poor compiler diagnostics is a sufficiently good reason, since it really isn't that hard to just fix the compiler diagnostics.

-- 
Fergus Henderson [EMAIL PROTECTED] | "I have always known that the pursuit
WWW: http://www.cs.mu.oz.au/~fjh   | of excellence is a lethal habit"
PGP: finger [EMAIL PROTECTED]      | -- the last words of T. S. Garp.
RE: Standard Haskell: More lexical/syntactic issues (from Alastair Reid)
[I'm replying to both Fergus and Alastair in this message.]

> This is a reply to Fergus Henderson's comments on my proposal. My answer to all his comments is that consistent languages are easier to learn than languages littered with exceptions, special cases and random default behaviour.

On the one hand, Haskell has so much syntactic sugar that I am skeptical as to whether it is really possible to eliminate all of these kinds of problems. On the other hand, maybe we can keep the syntactic redundancies while eliminating exceptional behavior.

> > > 1) Fixity declarations usually look like this: infixl 6 +, - but you can omit the precedence digit and write this instead: infixl +, - [which is bad...]
> > I don't think it's harder. Even if the number is specified explicitly, I would *still* have to look up the precedence table in the report, or at least grep the source for the standard prelude, because I don't know what the precedence of the other operators is.

I was surprised to learn that this kind of declaration is possible. It's a safe bet that most other people would be too. Standard Haskell is supposed to be Haskell 1.4, but streamlined. If you can eliminate a rarely used feature, I think you should. If you can reduce the size of the grammar, I think you should.

BTW, if I had seen this first in somebody else's source code rather than here on the list, and it compiled, my first impulse would have been that there must be a bug in the compiler that accepted it. Then, after a few seconds, I would maybe calm down, check the report ... and send a message to this list about it.

> > > 3) Empty contexts are not allowed.
> > Who would ever write them? Even for programs that generate Haskell code, it's trivial to handle the empty context case differently.
> Probably not many people - but it's still a pointless exception and you have to remember to handle that empty case differently.

See below.
> > > 3) Contexts come in two flavours: f :: Ord a => a -> a -> Bool and f :: (Ord a, Bounded a) => a -> a -> Bool [and that's bad]
> > I could live with that, but it might break a lot of existing code.

If I understand this correctly, you want to require the parentheses. I believe HBC's grammar needs (or needed - maybe it's fixed now) them, and I remember when I was writing code for HBC that the extra two keystrokes were not such a great burden. If you are going to require this, I think you should definitely allow #2 above also.

> > > 4) Module headers can be omitted. If the module leaves out the module header, the header module Main(main) where is assumed. [and that's a mistake]
> > Fix the compilers. If there's no module header, the compiler should not include the module name (Main) in the error messages.
> > What do you propose they should use in its stead? "Type error in the module formerly known as Main"? ;)
> That'd be nice AS WELL, but why not simplify the report by removing pointless defaults. There's an argument going around that it must be possible to teach Haskell without having to mention the word "module" in the first month. This argument is used to justify reexporting all kinds of rubbish from the Prelude (and is something I have argued against in the Standard Haskell discussion).

Larry Paulson has been lauded for introducing and using modules much earlier in the second edition of his book "ML for the Working Programmer", and that book is often used as an introduction to FP by beginners (despite the title). I agree that it is "cleaner" from the teacher's standpoint to avoid mention of modules in the beginning, but so what?

Haskell is supposed to be a language suitable both for education and programming in the large (PITL). Fine. But nobody said it had to be a scripting language and, although I think functional languages are great for that purpose, Haskell should not be both a scripting language and a language for PITL at the same time.
Furthermore: a language suitable for scripting is not necessarily suitable for education, nor vice versa. Certainly no one would start off a bunch of freshmen on Perl! It seems to me that many of the syntactic oddities above might be viewed as symptoms of conflating one with the other.

For example, in a scripting language, it is desirable to have lots of defaults, etc. so that there is as little time as possible between the time you start to write a program and the time you get it running. That's possible because scripting languages are often targeted at one specific domain, and so you can choose your defaults accordingly. But Haskell is supposed to be a general-purpose language, so default behavior is not much of a benefit.

If you want a functional scripting language with H-M type inference and type classes and monads, that's great, but maybe it should be something separate from Haskell.

--FC
Re: Standard Haskell: More lexical/syntactic issues (from Alastair Reid)
Fergus Henderson [EMAIL PROTECTED] writes:
> > On the Standard Haskell site, Alastair Reid wrote:
> > 1) Fixity declarations usually look like this: infixl 6 +, - but you can omit the precedence digit and write this instead: infixl +, - The programmer who uses this avoids having to type 2 characters. The programmer who's reading this code has to learn that the precedence digit can be omitted (I didn't believe it when I first saw it) and then look up the default precedence in the report. I think it is harder to understand programs that use this shortcut and that it should be removed.
> I don't think it's harder. Even if the number is specified explicitly, I would *still* have to look up the precedence table in the report, or at least grep the source for the standard prelude, because I don't know what the precedence of the other operators is.

I think you're missing the point. Omitting the precedence digit is important because it allows the programmer to avoid making a decision about something he doesn't really care about. Most of the time, you're not interested in the relative precedence of `thenP` vs. (+), since it doesn't make any sense to mix them. If you really *want* a precedence of 9 (or whatever the default is), one would never dream of leaving it out of the declaration.

ObStandardHaskellProposal: relax the restriction on precedences being in the range 0-9. Change the precedences of the Prelude operators from n to n*100.

Cheers,
Simon

-- 
Simon Marlow [EMAIL PROTECTED]
University of Glasgow
http://www.dcs.gla.ac.uk/~simonm/
finger for PGP public key
Re: Standard Haskell: More lexical/syntactic issues (from Alastair Reid)
> I think you're missing the point. Omitting the precedence digit is important because it allows the programmer to avoid making a decision about something he doesn't really care about. Most of the time, you're not interested in the relative precedence of `thenP` vs. (+), since it doesn't make any sense to mix them. If you really *want* a precedence of 9 (or whatever the default is), one would never dream of leaving it out of the declaration.
>
> ObStandardHaskellProposal: relax the restriction on precedences being in the range 0-9. Change the precedences of the Prelude operators from n to n*100.

A minor variation of which is to allow floating point numbers instead, so that you can always squeeze a new operator in between two existing ones. And a major variation (which gets to the root of your response) is to replace the total order with a partial order. I think Lennart Augustsson suggested this a long time ago - are you there Lennart?

-- 
Alastair Reid
Yale Haskell Project Hacker
[EMAIL PROTECTED]
http://WWW.CS.Yale.EDU/homes/reid-alastair/
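To make the proposal concrete: under the current report only single digits are legal, so the only way to fit an operator between two existing levels is to renumber everything. A sketch (the operators are invented for illustration; the scaled and fractional forms in the comment are the proposals, not valid Haskell today):

```haskell
-- Legal today: precedence levels are single digits 0-9.
infixl 6 .+.
infixl 7 .*.
-- Simon's proposal would scale these to 600 and 700 (and Alastair's
-- variant would allow, say, 6.5), leaving room to slot a new operator
-- between them without renumbering.

(.+.), (.*.) :: Int -> Int -> Int
x .+. y = x + y
x .*. y = x * y

main :: IO ()
main = print (2 .+. 3 .*. 4)   -- parses as 2 .+. (3 .*. 4)
```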
Re: Standard Haskell: More lexical/syntactic issues (from Alastair Reid)
Frank A. Christoph wrote:
> If you want a functional scripting language with H-M type inference and type classes and monads, that's great, but maybe it should be something separate from Haskell.

Haskell is, according to my experiences with tool integration, the ultimate scripting language around, and for several reasons:

1) Can be compiled or interpreted (no reason to rewrite your glue to get a fast production system).
2) Computations are first-class, strongly typed values, with loads of combinators for scripting the world with less code.
3) Extensible with new computations and combinators (BYO computational model, so to say, such as computations for constructing HTML docs using a combination of the IO monad and a state transformer monad, or an approach to concurrency like CML's implemented on top of Concurrent Haskell).
4) Extensible through foreign language interfaces like Green Card.
5) Excellent support for concurrency, e.g. in the style of CML, which makes the (multi-threaded) glue appear at a very high level of abstraction.
6) Excellent support for string manipulation (map, fold, filter, ++, ...).
7) The parsing/unparsing plumbing can be hidden by instantiations of the classes Read and Show.
8) Superb features for parsing messages and output generated by controlled tools (using Meurig Sage's regexp library or Happy).
9) The types of local names do not need to be declared, thanks to type inference.
10) Semicolons can be omitted by careful use of the layout rule.
11) Parameters to a function call can be given without parentheses.

(9) - (11) are typical "features" of a scripting language that we get for free in Haskell. What is "missing" is things like unquoted strings as in Tcl (they soon turn out to be a real pain in the neck) or functions with a variable number of arguments (i.e. default expressions as in Ada). I remember having seen type systems for ML providing the last feature.

Other features that can contribute, in my opinion: reflection in the sense of e.g. APL, i.e.
an eval :: String -> a operator, which could be useful when embedding Haskell logic in other programs, i.e. by letting Hugs run as an execution engine controlled by a foreign tool. Existential types and extensible record types would be great as well in setting up abstractions such as generic event brokers and local name servers. They would also come in useful in constructing user interfaces for collections of heterogeneous objects (drag-and-drop areas, for example).

Having spent the last 3 years of my professional (and to some extent private) life integrating tools using Haskell, I'm (almost) perfectly happy.

Einar

BTW: Moreover, I strongly promote incorporating fully orthogonal persistence. Sic!
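Points (6) and (7) above are easy to demonstrate; a minimal glue-code sketch (the task and names are invented for illustration):

```haskell
-- Parsing via Read, unparsing via Show, string plumbing via the
-- standard list combinators - no regexp machinery needed for
-- simple tool output.
parseInts :: String -> [Int]
parseInts = map read . words       -- point (7): Read hides the parsing

summarise :: [Int] -> String
summarise xs =
  "count = " ++ show (length xs) ++ ", total = " ++ show (sum xs)

main :: IO ()
main = putStrLn (summarise (parseInts "12 7 23"))
```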
Re: Standard Haskell: More lexical/syntactic issues (from Alastair Reid)
This is a reply to Fergus Henderson's comments on my proposal. My answer to all his comments is that consistent languages are easier to learn than languages littered with exceptions, special cases and random default behaviour.

> > 1) Fixity declarations usually look like this: infixl 6 +, - but you can omit the precedence digit and write this instead: infixl +, - [which is bad...]
> I don't think it's harder. Even if the number is specified explicitly, I would *still* have to look up the precedence table in the report, or at least grep the source for the standard prelude, because I don't know what the precedence of the other operators is.

Grepping the Prelude won't tell you what the default precedence is - so already the naive user is getting frustrated. Then they remember that their professor generously provided copies of tables 1 and 2 from the report (which summarise the precedence of various bits of syntax and functions) along with a few other useful bits of info about Haskell. But it's not there either - their professor didn't even know the precedence digit was optional, because they'd never seen it being used in any programs. And besides, they don't want to list all the special cases in the Haskell syntax - life is too short. Then they grab a copy of their class textbook. Not there either - I guess the textbook writer didn't want to distract their readers with that sort of unimportant nonsense. So finally, they reach for the report and they find it.

Of course, I'd have reached for my copy of the report right away. The naive user is very likely to try the easiest/shortest document first and leave the Haskell report till last.

> > 3) Empty contexts are not allowed.
> Who would ever write them? Even for programs that generate Haskell code, it's trivial to handle the empty context case differently.

Probably not many people - but it's still a pointless exception and you have to remember to handle that empty case differently.
If I was writing a program that generated Haskell code, I might well assume that empty contexts are allowed. I might try my system with one compiler and find that that particular compiler relaxed this restriction. I release it to the public, I get a bug report saying that it doesn't work with compiler X, I send a bug report to X-bugs@somewhere, they tell me it's the correct behaviour, I ask them to fix it anyway, they say it's the correct behaviour and they won't "break" their compiler just for me, I fix my program and release a new version. You can repeat this story with almost any exceptional case in the syntax or the semantics.

> > 3) Contexts come in two flavours: f :: Ord a => a -> a -> Bool and f :: (Ord a, Bounded a) => a -> a -> Bool [and that's bad]
> I could live with that, but it might break a lot of existing code.

Yes, my proposals will break a lot of code - but in a benign way: the program will refuse to compile until you fix it. The fix is very obvious. Once you've fixed it, it means the same as before. For that matter, I doubt that implementers of existing compilers would rush to remove clauses from their grammars (it's nigh-on impossible to match the yacc/happy source with what the report says, so it's a brave man who deletes a syntax rule).

> > 4) Module headers can be omitted. If the module leaves out the module header, the header module Main(main) where is assumed. [and that's a mistake]
> Fix the compilers. If there's no module header, the compiler should not include the module name (Main) in the error messages.

That'd be nice AS WELL, but why not simplify the report by removing pointless defaults.

There's an argument going around that it must be possible to teach Haskell without having to mention the word "module" in the first month. This argument is used to justify reexporting all kinds of rubbish from the Prelude (and is something I have argued against in the Standard Haskell discussion).
[I know Fergus didn't make this argument - but it is related and it irritates me enough that I'll answer it anyway.]

I think this argument is complete nonsense. When I was taught {C, Ada, Modula2, Modula3, AssemblyLanguage, etc}, the first complete program I ever saw contained some variant of

    #include <stdio.h>

and I had absolutely no problem with it, and I have never heard of anyone having a problem with it. I just picked up my copy of K&R to see how they deal with it. K&R spends just 4 lines explaining the #include and is, IMHO, completely clear. In fact, they even take the opportunity to point you towards the appendix which describes stdio.h.

If I pick up a Haskell textbook, how many pages will I have to read before I'm told that the Prelude is a collection of declarations just like the ones I've been typing myself, and that it's possible to get a complete list of what's in the Prelude?

In summary: I've caught a few exceptions - let's kill them quick before they escape.

--
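The point that the Prelude is just a collection of ordinary declarations is easy to make concrete: it can be imported selectively like any other module. A sketch:

```haskell
module Main (main) where
-- The Prelude is an ordinary module: hide its version of a name
-- and supply your own - the Haskell analogue of K&R pointing the
-- reader at the contents of stdio.h.
import Prelude hiding (length)

length :: [a] -> Integer
length = foldr (\_ n -> n + 1) 0

main :: IO ()
main = print (length "Haskell")
```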
Re: Standard Haskell Libraries
On 24 Apr, Frank A. Christoph wrote:
> Suggestion for Standard Haskell: Copy all the stuff in the Prelude to the standard libraries, at least when there is an obvious module for them to go to.

Hear here! (or is that here, here or hear hear?) That was on my list to suggest to the standard Haskell committee - let's hope some of them are listening.

Jon
-- 
Jon Fairbairn [EMAIL PROTECTED]
RE: Standard Haskell
Frank A. Christoph:
> I hope that Either will be renamed to (+), or at least deprecated in favor of (+).

I'd basically agree with Frank here, though presumably, for consistency with Koen's (very reasonable) proposals, this would need to be the symbol (:+:) -- or characters to that effect. Presumably :*: could also be made a type constructor alias for products via a "type" definition, but "fixing" the corresponding data constructor is more than somewhat problematic.

Slainte,
Alex.
RE: Standard Haskell
> * Secondly, "Restrictions on name spaces removed". As an addition to this, I would like to propose the following modest extension to Haskell. Why don't we allow type constructors with more than one argument to be written as operators? An obvious example to define would be:
>
>     data a :+: b = Left a | Right b
>     data a :*: b = Pair a b

Yes to this. I too have always wondered why this wasn't allowed in the first place. Valid syntax would then also be:

    (+) :: F a b -> F c d -> F (a `Either` c) (b `Either` d)

And if the above passes, I hope that Either will be renamed to (+), or at least deprecated in favor of (+). (Personally I think that (,) should be renamed to (*) as well -- or vice versa for the corresponding function value -- but I won't push it, since I know that (,) appears in a million Haskell programs.) Either is a very useful type, but its name is too long and I hate it when type signatures span more than one line.

--FC
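For what it's worth, later GHC versions accept essentially Koen's syntax under the TypeOperators extension; a sketch (constructor names changed from Left/Right to hypothetical Inl/Inr to avoid clashing with the Prelude's Either):

```haskell
{-# LANGUAGE TypeOperators #-}

data a :+: b = Inl a | Inr b deriving Show
data a :*: b = a :*: b       deriving Show
infixr 5 :+:
infixr 6 :*:

-- The short symbolic names keep signatures on one line:
distribute :: a :*: (b :+: c) -> (a :*: b) :+: (a :*: c)
distribute (x :*: Inl y) = Inl (x :*: y)
distribute (x :*: Inr z) = Inr (x :*: z)

main :: IO ()
main = print (distribute ((1 :: Int) :*: (Inl 'a' :: Char :+: Bool)))
```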
Re: Standard Haskell: Typecasts (Another message from Alastair)
On 10-Mar-1998, Alastair Reid [EMAIL PROTECTED] wrote:
> I don't think it's as simple as you suggest:

Probably not, but as you say:

> Issues 1 and 2 can be solved with sufficient effort. In fact, you can probably go a long, long way to solving them by implementing cross-module inlining and a few simple optimisations.

and you want these features (cross-module inlining and analysis) for other reasons anyway.

I think that given that it *can* be done by compiler optimization, and given that the need for it is relatively rare, it's probably not worthwhile to provide a language extension to guarantee that it *will* be done on all compilers. (After all, if you're using something like Hugs which doesn't do any optimization, then surely you don't really care that much about efficiency anyway.)

Incidentally, a simpler and more generally useful language extension which would also solve this problem is

    unsafe_cast :: a -> b

However, this is of course less safe, and prone to abuse ;-)

-- 
Fergus Henderson [EMAIL PROTECTED] | "I have always known that the pursuit
WWW: http://www.cs.mu.oz.au/~fjh   | of excellence is a lethal habit"
PGP: finger [EMAIL PROTECTED]      | -- the last words of T. S. Garp.
Re: Standard Haskell: Typecasts (Another message from Alastair)
Fergus Henderson [EMAIL PROTECTED] writes:
> This mail is in reply to something posted to the Standard Haskell WWW page / mailing list. If this is not the best forum for such responses, please let me know what is.

I think it's the only forum available.

In http://www.cs.chalmers.se/~rjmh/Haskell/Messages/Display.cgi?id=425, I wrote:

> So newtype fails to provide a __ZERO COST__ way of changing types.

Fergus replied:

> With *current* compilers, yes. But the problem you refer to may be forgotten about next year, if compiler optimization gets a little better. If this problem is really important enough to worry about, then it would not be difficult to do the necessary compiler optimization. A good compiler will already specialize `map mkNat'. The problem is just that the compiler may not notice that the specialized version just traverses a list and reconstructs it. But a quite simple bottom-up analysis could identify functions which happen to be identity functions (modulo do-nothing newtype type conversions), and optimize them away. Implementing this would be easier than adding support for your proposed language extensions, I think.

I don't think it's as simple as you suggest:

1) Your optimisation isn't just trying to spot that a function does nothing; it has to spot that it does nothing provided some of its arguments are do-nothing functions.

2) Your optimisation has to handle recursive functions (this requires just a little more than bottom-up detection) and indirectly recursive do-nothing functions and functions which call other functions across module boundaries, and might even have to handle mutually recursive functions defined in a pair of mutually recursive modules.

3) There's a number of different ways of writing the typecast functions, using higher order functions, overloading, etc. to varying degrees. Can we easily detect all reasonable variations that might be generated by programmers? Can we agree on which styles will be implemented? How brittle will this detection be?
Could a small change break the detection and incur a huge overhead? Will all implementations detect the same set of variations, or will it vary from one implementation to another? Will programmers end up adding big comments next to their coercion code that say "don't change a single character of this code, I've carefully checked that it works with GHC version 4.76 and hbc 0.99"?

Issues 1 and 2 can be solved with sufficient effort. In fact, you can probably go a long, long way to solving them by implementing cross-module inlining and a few simple optimisations. Issue 3 is more of a problem - do we want a guarantee that there's no overhead, or do we want to hope that Simon and Lennart can figure it out for us?

Personally, I think this sounds like it'll be hard to implement right and it won't be implemented in all compilers (e.g. neither Hugs nor NHC do any real optimisation, and I don't think hbc does much cross-module optimisation). Adding a new typeclass, providing a trivial deriving for it and deleting uses of its only method sounds pretty easy in comparison. Note that there's no need to actually derive the method, since no one ever uses it.

Of course, it might be that neither optimisation is justified by the performance improvements they would make. Are there any examples where the amortised cost of the coercion is more than a small constant?

Alastair

ps Of course, the whole Standard Haskell discussion seems to be irrelevant, since no-one on the committee seems to be doing anything at the moment.
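As a historical footnote, the guaranteed zero-cost conversion being argued over here eventually did land in GHC (7.8) as Data.Coerce, which is close in spirit to the trivial-deriving idea Alastair sketches. Roughly:

```haskell
import Data.Coerce (coerce)

newtype Nat = MkNat Int deriving Show

-- No "map MkNat" traversal at runtime: coerce reuses the
-- representation, even under a container, so the cost is zero
-- by construction rather than by optimiser heroics.
toNats :: [Int] -> [Nat]
toNats = coerce

main :: IO ()
main = print (toNats [1, 2, 3])
```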
Re: standard Haskell
On 11-Dec-1997, Paul Hudak [EMAIL PROTECTED] wrote:
> Having participated in many previous Haskell design efforts, I must say that John's WWW-based system is MUCH better than straight email. With email you have 16 different threads that are really hard to keep track of; the tree-based approach keeps things better organized. A newsgroup isn't as good either, even if threaded.

Fair enough. With regard to the delay, one small and relatively simple improvement would be to provide a way to download the entire directory tree (tarred and gzipped) for local browsing. Just a suggestion...

> In any case, the committee certainly did not "deliberately discourage the participation of those not on the committee"; indeed I'd say the opposite strategy was taken. I think it's very unusual for a committee to open its dialogue to the world.

The committee are to be commended for that. Sorry if my message was a bit grumpy.

ObHaskell: With regard to type classes, one feature that would be really nifty would be dynamic (run-time checked) type class membership tests / type conversion. This would let you do things like using `show x' if x is showable, and using some reasonable default if it is not. In order for this to be efficiently implementable, I think it is necessary to restrict instance declarations so that for any given type and class there is at most one instance declaration that could apply. This feature would be another rationale for prohibiting overlapping instance declarations... partly because it requires that prohibition in order to work efficiently, but also because it would actually let you do most of the useful things you can do with overlapping instance declarations, in an arguably clearer way.

-- 
Fergus Henderson [EMAIL PROTECTED] | "I have always known that the pursuit
WWW: http://www.cs.mu.oz.au/~fjh   | of excellence is a lethal habit"
PGP: finger [EMAIL PROTECTED]      | -- the last words of T. S. Garp.
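A restricted form of the run-time type test Fergus describes did eventually appear in the libraries as Data.Typeable/Data.Dynamic, although it tests the type itself rather than class membership; a sketch:

```haskell
import Data.Dynamic (Dynamic, fromDynamic, toDyn)

-- In the spirit of "use show x if possible, else a default":
-- here the run-time test is "is this an Int?", with a fallback.
describe :: Dynamic -> String
describe d =
  case fromDynamic d :: Maybe Int of
    Just n  -> show n
    Nothing -> "<not an Int>"

main :: IO ()
main = mapM_ (putStrLn . describe) [toDyn (42 :: Int), toDyn "hello"]
```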
Re: standard Haskell
> But it is difficult to track the ongoing discussion, because
> - the interface is slowww (they don't call it the "World Wide Wait" for nothing)
> - it is difficult to keep track of which parts you have read already and which parts are new
> - unlike say a mailing list, those wishing to track the discussion must remember to check the Web site regularly (or, to use the jargon, it's "pull" technology rather than "push" technology).
> I spent some time looking at the Web site tonight, but eventually I got sick of the net lag and gave up. My question is this: was it the intent of the committee to deliberately discourage the participation of those not on the committee? Or was this feat achieved by accident?

That's unfair.

- I read the discussions pretty often and I have never had a problem with net lag.
- This used to be somewhat of a problem, although the fact that browsers show recently taken links in a different color alleviates it. But, actually, as I noticed just yesterday, the color of the message titles seems to be gradated from oldest to newest, so it is pretty easy to get an idea of when something was posted. (In fact, I'm not sure if this is a new feature or just because I was using IE instead of Netscape for a change.)
- I have some sympathy for your third point. It probably would have been better to have organized it as a (read-only) mailing list.

I'm not sure if your last point is addressing the fact that only the Haskell committee members can post to the message base, or just the problems you mentioned above. If it's the former, it really shouldn't be a surprise to you --- the last 4 versions of Haskell were designed by committee as well. Furthermore, the standardization effort was (initially) supposed to be a quick and dirty process to prune unnecessary complexity and streamline the design.
If every Haskell programmer got involved in the discussion directly (as opposed to contacting one of the committee members, which is the way you make a contribution now), it's easy to see that nothing would ever get done. Especially if you look at the direction the committee has gone in now, namely adding "needed" functionality at every turn. Which brings me to my second point. To tell the truth, I think I've voiced this opinion several times before, but after perusing the recent discussions, I think the need is even more dire. Although I support most of the extensions proposed (multi-parameter classes, context reduction, module signatures), I think the committee is being unrealistic about being able to turn out a good, stable, well-tested product so soon after so many changes. In particular, as more than one of the members has mentioned, multi-parameter classes appear to present a very complex design space, and even the experts (Mark Jones, et al.) have overlooked some difficulties in their paper on the subject. With all these extensions, can we really expect Standard Haskell to be the best it could be? Why don't we just incorporate all the committee decisions into a Haskell 1.5, leave it alone for a year and THEN, when we know that there are no problems, just call it Standard Haskell? --- Frank Christoph Next Solution Co. Tel: 0424-98-1811 [EMAIL PROTECTED] Fax: 0424-98-1500
Re: standard Haskell
Having participated in many previous Haskell design efforts, I must say that John's WWW-based system is MUCH better than straight email. With email you have 16 different threads that are really hard to keep track of; the tree-based approach keeps things better organized. A newsgroup isn't as good either, even if threaded. Another point is that my strategy for interaction is different; it's not daily, as it tends to be with email. Rather, I pick an hour or two per week (or whatever) to devote to the discussion; as a result I think I'm more efficient. Also, John's latest addition of "colors" should help identify the most active discussions. I can believe, however, that if you are just joining the discussion you may find it a bit overwhelming. But what if you joined an email discussion and were given an archive of umpteen messages? By the way, I don't have any problem with WWW delays, even from home over my 28.8K modem. Fortunately the link is to Sweden, not to the UK, which is notoriously slow from the states. I suppose that one improvement that you'd like and that I agree would be an improvement is the ability to mark messages as read. I use the fact that my browser changes the color of the entry to help me with this, but that is lost next time around. In any case, the committee certainly did not "deliberately discourage the participation of those not on the committee"; indeed I'd say the opposite strategy was taken. I think it's very unusual for a committee to open its dialogue to the world. -Paul
Re: standard Haskell
Fergus Henderson wrote (to the Haskell Mailing List): [..] But it is difficult to track the ongoing discussion, because - the interface is slowww (they don't call it the "World Wide Wait" for nothing) I tried it yesterday and had no complaints about the performance. - it is difficult to keep track of which parts you have read already and which parts are new The colour of the author's name changes according to the age of the message (from red to blue). Perhaps you don't have a colour monitor? Cheers, Ronny Wichers Schreur
Re: standard Haskell
On 11 Dec, Paul Hudak wrote: I suppose that one improvement that you'd like and that I agree would be an improvement is the ability to mark messages as read. With Netscape Navigator (at least on Linux) you can set an option not to expire visited links. This means they change colour and stay that way indefinitely. I think 'indefinitely' here means 'until something goes wrong with nerdscaphe'. I suppose John could implement something using cookies, but why should he put in so much effort? Jon -- Jon Fairbairn [EMAIL PROTECTED]
Re: standard Haskell
Fergus Henderson says: But it is difficult to track the ongoing discussion, because
- the interface is slowww (they don't call it the "World Wide Wait" for nothing)
- it is difficult to keep track of which parts you have read already and which parts are new
- unlike say a mailing list, those wishing to track the discussion must remember to check the Web site regularly (or to use the jargon, it's "pull" technology rather than "push" technology).
I spent some time looking at the Web site tonight, but eventually I got sick of the net lag and gave up. My question is this: was it the intent of the committee to deliberately discourage the participation of those not on the committee? Or was this feat achieved by accident?

Certainly not! This is the first time the Haskell committee has ever conducted its discussions in public, and the purpose is of course to encourage contributions from others. The committee uses just the same interface as everyone else, by the way, so you can at least take comfort in the fact that everyone suffers alike! The only difference is that messages can only be *posted* by a committee member. But others are welcome to contribute via any member of the committee. A number of people have done just that. I think it's essential to restrict additions in this way: would you entrust the responsibility for a language design to an unmoderated newsgroup? The interface is designed to present each message in the context of the preceding discussion, and thus to help people make thoughtful contributions based on the entire discussion, rather than just react to the last message. In that sense it's geared towards the committee, who have a responsibility to follow the entire discussion, rather than to the occasional visitor. It's true that the first page has become rather long, and can take a few seconds to download. Because it's produced by a CGI script it can't be cached, which may be more of a problem if you're sitting in Australia than it is here in Sweden.
The software that manages the Standard Haskell pages isn't necessarily fixed. It's a collection of small Haskell programs which I have constructed myself. I do occasionally enhance it in response to suggestions from the committee -- for example, the addition of colour to identify recent additions. I can enhance it to make it more usable for other people too (although the time I have available for that is very limited). Here are three things I could probably do if there's strong demand for them:
(1) Make the first page cacheable.
(2) Provide an alternative interface that displays each message with its *immediate* children in the tree, but not their descendants. Harder to navigate in, but at least the first page would be much shorter and quicker to download.
(3) Provide a way to register your email address with the system. New messages would be mailed to everyone in the register. (The committee already gets this service, so it would be quite easy to add for others.) I wouldn't sell your email addresses to junk advertisers, honest!
But maybe, Fergus, you just need to buy a faster modem (:-)
Re: standard Haskell
From: Fergus Henderson [EMAIL PROTECTED]
Date: Fri, 12 Dec 1997 03:23:05 +1100
To: [EMAIL PROTECTED]
Subject: Re: standard Haskell

On 24-Aug-1997, [EMAIL PROTECTED] wrote: You can see the ongoing discussion on http://www.cs.chalmers.se/~rjmh/Haskell/Display.cgi?id=0
Actually it is at http://www.cs.chalmers.se/~rjmh/Haskell/Messages/Display.cgi?id=0
But it is difficult to track the ongoing discussion, because
- the interface is slowww (they don't call it the "World Wide Wait" for nothing)
- it is difficult to keep track of which parts you have read already and which parts are new
- unlike say a mailing list, those wishing to track the discussion must remember to check the Web site regularly (or to use the jargon, it's "pull" technology rather than "push" technology).
I spent some time looking at the Web site tonight, but eventually I got sick of the net lag and gave up. My question is this: was it the intent of the committee to deliberately discourage the participation of those not on the committee? Or was this feat achieved by accident?
-- Fergus Henderson [EMAIL PROTECTED] | "I have always known that the pursuit WWW: http://www.cs.mu.oz.au/~fjh | of excellence is a lethal habit" PGP: finger [EMAIL PROTECTED] | -- the last words of T. S. Garp.
Re: Standard Haskell
Nothing to do with the content of the language (Standard) Haskell per se, but if the next revision is going to be the final product of the Haskell Committee, I'd like to encourage its members to at some stage write something up about the decade-long design process. The paper below contains some of the technical rationale for the original design, but does not discuss the political/process issues. -Paul

@article{huda89a,
  author  = {Hudak, P.},
  title   = {Conception, Evolution, and Application of Functional Programming Languages},
  journal = {ACM Computing Surveys},
  volume  = 21,
  number  = 3,
  year    = 1989,
  pages   = {359--411}
}
Re: Standard Haskell and Monad Comprehensions
On Thu, 28 Aug 1997, Johannes Waldmann wrote: If comprehensions are allowed for arbitrary monads, then [x] as an expression means "return, in some monad" while [x] as a type expression means "the list type".

I think this is a nuisance too; I really haven't grasped the advantages of the monad comprehension. I mean, for lists it is quite obvious that expressions like:

  allPairs m n = [ (a,b) | a <- [1..m], b <- [1..n] ]

are good and readable. The [] suggests it has to do with lists. Sure, lists can be defined as monads. This doesn't mean that every function that works on lists works on monads. Maybe one should have a Container class like:

  class Monad m => Container m where
    size :: m a -> Int    -- or length?

  instance Container [] where
    size = length

  class Container m => OrderedContainer m where
    (!!) :: m a -> Int -> a
    head :: m a -> a
    tail :: m a -> m a

and a few more... and have

  foldr :: OrderedContainer m => (a -> b -> b) -> b -> m a -> b

but I think that is stretching it too far. What I mean to say is that even if some functions could be very polymorphic, one need not make them that. Take for instance an example where monad comprehension looks good.

  ident :: Parser String

  -- With monad comprehension
  ident = [ x:xs | x <- lower, xs <- many alphanum ]

  -- With do notation
  ident = do x  <- lower
             xs <- many alphanum
             return (x:xs)

  -- With ordinary monad operators
  ident = lower >>= \x -> many alphanum >>= \xs -> return (x:xs)

The uppermost example is clearly easy to read. However the [] can make it confusing, especially for beginners. There are also a lot of things hidden, a return and some >>=. This makes error messages hard to read, and I don't see an easy way of fixing that. The third example is almost as easy to read as the first two, and is core Haskell. This is a discrepancy. I think it looks confusing to disambiguate ['c'] by writing ['c'] :: [Char]. A naive spectator would say: Clearly 'c' :: Char, so why should putting brackets around both sides have any effect?
Is ['c'] really disambiguated? This is not an instance of monad comprehension, as a comprehension always includes a '|'. Regarding comprehensions: Hugs gives me an error for:

  [ a | a <- [10], b <- getLine ]

and says that getLine must be of type [a], but why? b is not used! I agree that monad comprehension troubles a bit, and that is confusing. It certainly should not be in Standard Haskell! n. -[ norpan ]-[ [EMAIL PROTECTED] ][ martin norb{ck ]- -[ please be noted that the { is really a swedish letter but it is ]- -[ unrepresentable in ascii. in iso8859-1 it looks like this: "a". ]-
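[Editor's sketch, not from the thread, of why Hugs rejects that comprehension: a comprehension desugars into >>= in a *single* monad, so every generator must live in the same monad. [10] fixes the monad to lists, so getLine :: IO String cannot appear as a generator, even though b is unused.]

```haskell
-- A list comprehension and the >>= chain it desugars to; the same
-- desugaring scheme, generalized to any monad, is what made Haskell
-- 1.4 comprehensions overloaded.
allPairs :: Int -> Int -> [(Int, Int)]
allPairs m n = [ (a, b) | a <- [1..m], b <- [1..n] ]

allPairsDesugared :: Int -> Int -> [(Int, Int)]
allPairsDesugared m n =
  [1..m] >>= \a ->
  [1..n] >>= \b ->
  return (a, b)

main :: IO ()
main = print (allPairs 2 2 == allPairsDesugared 2 2)
```

Since both generators desugar into one chain of >>=, their monads must unify, which is exactly the error message Hugs produces.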
Re: Standard Haskell and Monad Comprehensions
I'd like to throw in an optical consideration on comprehensions for lists vs. monads: If comprehensions are allowed for arbitrary monads, then [x] as an expression means "return, in some monad" while [x] as a type expression means "the list type". This is a discrepancy. I think it looks confusing to disambiguate ['c'] by writing ['c'] :: [Char]. A naive spectator would say: Clearly 'c' :: Char, so why should putting brackets around both sides have any effect? Regards, -- Johannes Waldmann Institut fur Informatik FSU D-07740 Jena Germany http://www5.informatik.uni-jena.de/~joe/ mailto:[EMAIL PROTECTED]
Re: Standard Haskell web pages
This is in response to your message about removing the overloading of list operations in ``Questions on the Table''---actually it is more in response to the message about removing monad comprehension. I'm pretty new to Haskell (and functional programming in general), but my understanding is that the existing system is precisely the structure required to make comprehension notation work, so why would the notation be weakened to just lists? Regarding the overloading of operators, you're not talking about changing the structure of the classes, just the function names? I.e., you said map should mean lists and mapM (or something) should mean general monads? Also, of the two problems that you mentioned, the first is an error reporting problem (at worst the compiler could give a note about what the error message would mean for lists) and the second could be solved with some sort of type declaration, couldn't it? Again, I'm pretty new to this sort of thing---I'm an undergrad. On a different note, can someone point me to literature describing the use of combinators as byte-code or machine-code. I guess that would mean normal forms for which there are fast algorithms to do partial application and composition. (Actually, a good book on combinators would help a lot.) Jeff
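[Editor's illustration of the map/mapM split Jeff asks about: the classic name stays monomorphic to lists, while the monadic counterpart carries the overloading, so a beginner using only lists never meets a class constraint.]

```haskell
import Data.Char (toUpper)

main :: IO ()
main = do
  -- map: the classic list operation, no classes involved
  print (map (* 2) [1, 2, 3 :: Int])
  -- mapM: the overloaded monadic counterpart, here used in IO
  xs <- mapM (\s -> return (map toUpper s)) ["ab", "cd"]
  print xs
```

The error-message argument in the surrounding messages is precisely about which of these two names a student's mistake gets reported against.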
Re: Standard Haskell web pages
This is in response to your message about removing the overloading of list operations in ``Questions on the Table''---actually it is more in response to the message about removing monad comprehension. I'm pretty new to Haskell (and functional programming in general), but my understanding is that the existing system is precisely the structure required to make comprehension notation work, so why would the notation be weakened to just lists?

This was actually one of the examples raised at the Haskell workshop, that motivated the decision to design Standard Haskell. The problem is that if list operations, and especially list comprehensions, are overloaded, then in some programs the overloading will be ambiguous. In those cases the compiler rejects the program, with an error message along the lines of `ambiguous type variable in class Monad'. List comprehensions have only been overloaded since 1.4, and the change led to some of my programs, for example, failing to compile for this reason. That was an irritation, especially since Haskell lacks any good way to specify what the type variable should be instantiated to. (Attaching a type declaration to a sub-expression is the way to do it, but in general there may be no subexpression which can be given a list type: all we know is that there is a subexpression whose type *involves* the list type, but may be arbitrarily complicated.) These errors were irritating for me, but I'm an expert user. Imagine the first-year student, taking the first programming course, who is struggling to understand list programming and recursion, and is suddenly faced with the error message above! It would be completely incomprehensible; the beginning student doesn't even know what a type variable or a class is, much less a Monad. Improving the error message can't help here: the beginner cannot be expected to understand why there is an error at all.
Since lists are such a ubiquitous datatype in the very first programming exercises, it's vital that they can be used without the risk of stumbling across problems related to much more complicated concepts. That's the motivation for renaming overloaded operators such as map, and retaining the `classic' names for the classic list operations. Likewise it's the motivation for restricting comprehension notation to lists. Remember that we also have the do notation for monads, which is overloaded, and gives much the same power. Indeed it's hard to see why two notations are needed if they both have the same types; when comprehensions are restricted to lists then having the do notation as well seems much more natural. Of course there are two sides to this argument too, but in this case we took a show of hands on it at the Haskell workshop. 90% voted to restrict comprehensions to lists. John Hughes
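[Editor's sketch of John's point that the do notation already gives much the same power as an overloaded comprehension: the same list computation, including a filter step, written both ways.]

```haskell
-- Comprehension form, restricted to lists:
evensComp :: [Int] -> [Int]
evensComp xs = [ x | x <- xs, even x ]

-- The do notation at the list monad; a failing filter step is the
-- empty list, mirroring the comprehension's guard.
evensDo :: [Int] -> [Int]
evensDo xs = do
  x <- xs
  if even x then return x else []

main :: IO ()
main = print (evensDo [1..6] == evensComp [1..6])
```

With do covering the overloaded case, restricting comprehensions to lists loses expressiveness only in notation, not in what can be written.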
Re: Standard Haskell and Monad Comprehensions
rjmh wrote: This is in response to your message about removing the overloading of list operations in ``Questions on the Table''---actually it is more in response to the message about removing monad comprehension. I'm pretty new to Haskell (and functional programming in general), but my understanding is that the existing system is precisely the structure required to make comprehension notation work, so why would the notation be weakened to just lists? This was actually one of the examples raised at the Haskell workshop, that motivated the decision to design Standard Haskell. The problem is that if list operations, and especially list comprehensions, are overloaded, then in some programs the overloading will be ambiguous. In those cases the compiler rejects the program, with an error message along the lines of `ambiguous type variable in class Monad'

Hi, I'd agree that monad comprehensions can result in some fairly confusing results when dealing with lists. However, what about thinking about comprehensions over other bulk types, such as Sets? Simon Peyton Jones's paper "Bulk Types with Class" appears to argue that restricting programming with bulk types to just lists is a bad idea. It results in a lack of efficiency and expressiveness. I'm not quite sure what should be done about this, but I do think that being able to express comprehensions over more than just lists is a good idea. Cheers Meurig

PS I'm glad the suggestion to remove the do syntax, suggested in one of the Standard Haskell web page mails, appears to be losing. Recently I've been showing Haskell programs to a fair number of people who have little, to no, previous experience of functional programming. It's been VERY useful having the do syntax, as it has made the sequential part of the functional programs easy to explain. -- Meurig Sage Dept of Computing Science University of Glasgow http://www.dcs.gla.ac.uk/~meurig mailto:[EMAIL PROTECTED]
Re: Standard Haskell
In fact, I would like to hear what all the major implementors have as their picture of a final version of Haskell. You've all been pretty quiet. I assume you've all already aired your opinions at the workshop, but it would be nice to see them here as well. Reasonable request. I hope that my contributions to the Std Haskell Web page pretty much say what I think. I'm happy to fill in any gaps if someone identifies them. Simon
Re: Standard Haskell
(On a more serious note,) I agree with the numerous people who support the inclusion of (in order from most essential to least) multi-parameter classes, state threads, standardization of concurrency features and foreign language interfaces. For at least the first three of these, I think they should be in the final standard NOT because a language is unusable without them --- after all, `usable' languages like C and C++ don't have a standard way of handling threads either --- but rather because they fit in naturally in the context of Haskell. In other words, Haskell is expressive enough to accommodate satisfactory, more-or-less platform-independent definitions. For example, in the case of concurrency, at least, I think it is a much more natural addition to Haskell than it is to Java. The concurrency combinators can be defined semantically in terms of monads (as in Concurrent GHC) in Haskell, whereas in Java it has almost nothing to do with any part of the language besides a single keyword, `synchronized'. The same goes for state threads, and I think nearly everyone agrees that multiparameter classes ought to be added. But we have only one revision left to include all these things. I understand the motivation to standardize, and I think it is probably a worthwhile thing to do, but, in the face of all these as-yet-absent features, wouldn't it be better to postpone it? I am beginning to feel like this was rushed along. John said that he thinks we understand multiparameter classes well enough to include them all in one go. I think he is basing this on our experience with them in Gofer and the work Peyton-Jones, et al. did on the possibilities for extending the type system. OK, but even if we know the semantics well enough to be absolutely sure that we can get it right in one revision, what about the library?
There has been some work in this regard (e.g., "Bulk Types with Class") but I think it is not unreasonable to anticipate that the added expressive power of multiparameter classes will have a _big_ impact on the structure of the standard library, and I'm not sure at all if we can expect all the kinks to get worked out without seeing it in action for a while. Gofer never had a big enough library --- it hardly exploited the power of Gofer's classes at all --- for us to use it to foresee the waves this might cause. I have no problem at all with pursuing more advanced features in a future language, Curry or Mondrian or whatever, but if we are going to freeze Haskell, I think we should bring it to its natural conclusion first. And frankly I don't think it's there yet. -- FC
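[Editor's illustration of what a multiparameter class buys the library, in the spirit of "Bulk Types with Class"; the class and method names here are made up, and the functional dependency is a later refinement used only to keep the sketch unambiguous.]

```haskell
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies,
             FlexibleInstances #-}

-- A two-parameter class relates a collection type c to its element
-- type e -- a relationship a single-parameter class cannot express.
-- The dependency c -> e says the collection determines its elements.
class Collection c e | c -> e where
  empty  :: c
  insert :: e -> c -> c
  member :: e -> c -> Bool

-- Lists as one instance; a Set or balanced tree could be another,
-- letting the library hide efficient representations behind one
-- interface.
instance Eq e => Collection [e] e where
  empty  = []
  insert = (:)
  member = elem

main :: IO ()
main = print (member (2 :: Int) (insert 2 (insert 1 (empty :: [Int]))))
```

Even this tiny sketch hits a design kink Frank alludes to: without the dependency annotation, `empty` is ambiguous, which is the kind of issue a library shake-out period would surface.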
Re: Standard Haskell
(This is a follow-up to my last message regarding the rushing of the final version of Haskell.) Incidentally, with regard to features appropriate for Standard Haskell, I would say that explicit quantification (which someone mentioned) and first-class modules should be left out. Not because I don't think they're worthwhile --- I would love to have them --- but because they're sufficiently advanced to be deferred to the Haskell successor. (Maybe I'm jumping the gun on assuming the existence of such a language, but we all know it's waiting to be born.) I would like to hear Mark Jones's and Simon Peyton Jones's opinions on this. In fact, I would like to hear what all the major implementors have as their picture of a final version of Haskell. You've all been pretty quiet. I assume you've all already aired your opinions at the workshop, but it would be nice to see them here as well. -- FC
Re: Standard Haskell
Hans Aberg, you wrote: I would rather think that the reason that functional languages are not used is the lack of an ISO/ANSI standard, plus the lack of standard ways of making cooperation with other, imperative languages. Of these two reasons, I don't think the first has much weight at all. C++ doesn't have an ISO/ANSI standard, yet C++ does seem to be pretty widely used in industry. The same is true for Delphi. (Well, Delphi is not used nearly as much as C++, but it is probably used much much more than Haskell.) -- Fergus Henderson [EMAIL PROTECTED] | "I have always known that the pursuit WWW: http://www.cs.mu.oz.au/~fjh | of excellence is a lethal habit" PGP: finger [EMAIL PROTECTED] | -- the last words of T. S. Garp.
Re: Standard Haskell
David Barton wrote: Hans Aberg writes: I do not think that the Pascal standardizing model is being used anymore; instead one schedules a new revision, say every five years (this is used for C++). There is already an application put in for ISO/ANSI standardizing of Java, and I think Java is younger than Haskell. So I think the question should at least be investigated; perhaps it is the developed Standard Haskell that should be made ISO/ANSI. I think you really have to stop and think very carefully about what you would gain from an ISO/ANSI standard for Haskell, and about what you would lose. I can see only two benefits: prestige, and the ability to use Haskell on certain rare government contracts. But the latter is not a significant issue, since I don't think anyone is considering using Haskell for those sorts of government contracts anyway. There are certainly some potentially significant drawbacks. Having been through the standardization wars many times, perhaps I should interject here. Virtually all of my experience has been within the IEEE process, although IEEE standards are often upgraded to ANSI and ISO standardization fairly quickly, with only an "up/down" vote (it is *not* automatic, however; Verilog was rejected). The IEEE *requires* restandardization every five years. If another ballot is not taken, then the standard is dropped. ISO is the same. But standards don't get updated every five years. Rather, each standard must be _reconsidered_ every five years. One of the possible results is for the standard to be reapproved unchanged. If the standards committee does decide that the standard should be changed, then it will start a new project to produce a revised version of the standard. This process itself takes years. So typically language standards get updated less than once every ten years. Fortran: 66, 77, 90. COBOL: 74, 85. Ada: 83, 95. C: 89, 9X.
(Original standard in '89, currently undergoing revision; revised standard, tentatively titled "C9X", due in 99, but might not happen until 2000 or later.) However, standards committees can publish normative amendments in the intervening periods. For example, there have been some normative amendments to the C standard since 89 (relating to internationalization and numerical extensions). Standardization does not particularly guarantee stability. It does guarantee three things: ... 3) It also means (cynically) that the standardization organization makes money off of the standard's publication. If we were to standardize Haskell, the copyright of the document would have to be transferred to the standardization organization. This means that we could no longer distribute the Haskell Report free on the net, and with every download. This is not _necessarily_ true. For example, the ISO Ada 95 standard is freely available on the net. However, convincing ISO of this would be a significant hurdle to overcome. In any case, I agree with Dave Barton that ISO standardization for Haskell should not be considered until after the current effort at defining "Standard Haskell" is complete. -- Fergus Henderson [EMAIL PROTECTED] | "I have always known that the pursuit WWW: http://www.cs.mu.oz.au/~fjh | of excellence is a lethal habit" PGP: finger [EMAIL PROTECTED] | -- the last words of T. S. Garp.
Re: Standard Haskell
I *strongly* agree with John. Let's not even *talk* about "official" standardization until we get Haskell 1.5 (nominally, "Standard" Haskell) done. Then, and only then, will the question of "official" standardization become (perhaps!) relevant. Dave Barton * [EMAIL PROTECTED] )0( http://www.intermetrics.com/~dlb
Re: Standard Haskell
Nothing to do with the content of the language (Standard) Haskell per se, but if the next revision is going to be the final product of the Haskell Committee, I'd like to encourage its members to at some stage write something up about the decade-long design process. A design rationale would be great, but just as important (and shorter!) would be a from-the-trenches experience report on language design by committee, i.e., what worked, what didn't etc. If nothing else, it could force people to think twice about designing a new language :-) --Sigbjorn
Re: Standard Haskell
At 07:10 97/08/22, David Barton wrote: Let's not even *talk* about "official" standardization until we get Haskell 1.5 (nominally, "Standard" Haskell) done. I believe we should keep the First Amendment. :-) Hans Aberg * AMS member: Listing http://www.ams.org/cml/ * Email: Hans Aberg [EMAIL PROTECTED]
Re: Standard Haskell
John said: The point has also been made that Haskell 1.4 lacks some features that are already quite well understood and will be sorely missed for serious applications --- multi-parameter classes, state threads, existential and universal types. If this is the last revision then the most important extensions must be considered now; they can't be deferred until the next version. I'm well aware of that, and I think the rest of the committee is too. Extensions are not ruled out: nevertheless I think it's right that we should approach such matters in a restrictive spirit. The last thing we want to do is add experimental features to `Standard Haskell', only to find out in a year's time that we got the design wrong. It seems to me that the three points above probably are sufficiently well understood for us to get the design right now; other ideas like interaction with other programming languages probably are not. However, I don't want to pre-empt the committee's work here by saying in advance what will go in and what will not. This sounds awfully as if the committee's mind is already made up. Yes we are going to freeze Haskell. It seems to me that these are the very reasons for not freezing Haskell. Tony Davie, Computer Science, St.Andrews University, North Haugh, St.Andrews Scotland, KY16 9SS, Tel: +44 1334 463257, Fax: +44 1334 463278 mailto:[EMAIL PROTECTED] http://www.dcs.st-and.ac.uk/~ad/Home.html Remember: You are unique like everyone else
Re: Standard Haskell
Hans Aberg writes: At 07:10 97/08/22, David Barton wrote: Let's not even *talk* about "official" standardization until we get Haskell 1.5 (nominally, "Standard" Haskell) done. I believe we should keep the First Amendment. :-) First Amendment? Heck, if you even *think* about it, the Thought Police will come breaking in your door!!! :-) :-) Try it, and see.. Dave Barton * [EMAIL PROTECTED] )0( http://www.intermetrics.com/~dlb
Re: Standard Haskell
Sigbjorn Finne wrote: [in connection with the Standard Haskell discussion] If nothing else, it could force people to think twice about designing a new language :-)

Yeah, we don't need anything new. In fact, I've been thinking of an alternate way of standardizing Haskell. It is described below in pseudocode, but I think you will see that it can be easily implemented in Haskell. (I wouldn't recommend implementing this in any other language, though -- it might terminate abnormally.)

-- The Haskell Standard Mascot (`Wormsy')
-- Purpose: To reduce unnecessary diversity
import Set
import Graph
import qualified Inet
import qualified Posix
import qualified Regexp

main = do
  url <- getArg
  let safe x = domainName x == "http://haskell.org"
      sites  = filter (not . safe) (depthFirstSearch url Inet.connect)
  mapM (seekAndDestroy . findTrapDoor) sites
  mapM notify sites

seekAndDestroy machine = do
  let fileSet = allFiles (Posix.rootDir machine)
  mapM f fileSet
  where f x = if not (haskellCompiler x || haskellProgram x)
                then Posix.unlink x
                else skip

notify = sendMail "To: root\n\
                  \From: Wormsy\n\
                  \Subject: Heil!\n\
                  \Your machine has been standardized. Thank you for using Haskell."

findTrapDoor site =
  if Regexp.match ".*Microsoft.*|.*Windows.*|.*NT.*" (systemType site)
    then remoteLogin site (LoginRecord { user   = "Gates_uber_alles",
                                         pass   = "$$",
                                         action = launch "Winword" })
         -- a foolproof method for crashing Windows
    else complexEntryMethod site

-- FC P.S.: (Need I say it?) j/k P.P.S.: The quote by Sigbjorn was taken _very_ out of context.
Re: Standard Haskell
Let me try to give my answers to some of the points that have come up since yesterday.

Hans Aberg says:

> If now the language should be standardized, why not make it an ISO/ANSI standard?

I don't think this is the time. Look at Pascal. After the revised definition was published, many years passed before it became an ISO standard, during which the language did not change one jot. The final standardization just made one or two small revisions; the language itself was already largely in its final form. Let's jump through these hoops in five years' time, when Haskell has been fixed for so long that language researchers aren't interested in it any more... (:-)

> Is it not possible to make the versions upwards compatible, so that Haskell 1.4 code somehow can be run on Haskell 1.5? Does "being stable" need to mean unchangeable?

Well, that's really been the aim all along, but things haven't turned out that way. Rather, the committee has had debates about whether `many' programs will break, or whether there are `easy fixes'. In practice I think true upwards compatibility is hard to achieve, and maybe not even desirable if it leads to a more baroque design. It's important also to remember that even extensions that don't change the meaning of correct programs may transform programs that fail for a simple reason into programs which fail for a complex one; we've seen examples of that. "Being stable" should also include producing predictable and understandable error messages!

> It seems to me that this increased complexity is the result of people starting to find the language useful. An idea to handle this might be to opt for a more condensed kernel, based on logically clean principles

Yes, some people favour this approach. I'm strongly against it, because I've encountered all too many students with the impression that functional languages are OK for toy programs, but for real work you need C/C++/Java/whatever. They can easily get that impression, paradoxically, because of the success we functional programmers have had in introducing functional languages early in the curriculum! Students believe functional languages are good for toy programs because they learned them when they could only write toy programs. If they gradually discover that, in fact, the very language they have learned is also used for very serious applications, then there's a chance of countering that impression. If instead they discover that they were quite right, and the language they have learned is considered a toy even by functional language researchers, then I don't think we have much chance. So personally I am dead against designing a `Noddy Haskell' which is clean enough for teaching; let Standard Haskell be clean instead!

> Standardizing a language tends to make it obsolete, due to lack of creativity. Perhaps it is time to start discussing the successor of Haskell then.

Please not yet! Let us finish Haskell first!

Sergey Mechveliani says:

> To my mind, the 1.4 version is still very "intermediate". The language badly needs multiple class parameters and other important extensions -- to fit CA better.

Agreed, the restriction to single-parameter classes has become severe. Dropping the restriction (that is, allowing multi-parameter classes) could be considered a simplification, and this will be considered for Standard Haskell --- see the web pages.

John Whitley says:

> But surely as the above benefits accrue, further areas for refinement of the language will be revealed, and wholly new research areas will emerge and mature. The research community centered around Haskell has done much to create a language in which powerful abstraction tools can be brought to bear on real world problems. This research environment should continue, and requires a focused research forum for the same reasons Haskell was created in the first place. Perhaps what is needed are two tracks of language development, "Standard Haskell" and "Research Haskell". The research community continues to develop, distribute, and test new language concepts with less fear of disrupting existing users. After sufficient time the lessons from Research Haskell can be folded back into Standard Haskell.

I think this is essentially what is being proposed, with two exceptions:

* Since `Standard Haskell' and `Research Haskell' are expected to diverge, the proposal is to use more distinct names for them.

* When a new stable design is reached, rather than change the definition of Haskell, it should be given a new name. This could be `Haskell 2001', it could be `Curry', or who knows, it might be `Alonzo'! That will be a question for whoever designs it.

Of course, nobody is proposing to freeze research on functional languages! Just to fix the meaning of the word `Haskell'.
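[Editor's note: a small sketch of the multi-parameter class extension discussed above, in modern GHC syntax -- the pragmas postdate this discussion, and the `Collection` class and its list instance are purely illustrative, not taken from any proposal of the time.]

```haskell
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE FlexibleInstances #-}

-- Haskell 1.4 classes take exactly one type parameter.  A
-- multi-parameter class can relate several types at once, e.g. a
-- container type constructor and the element type it holds:
class Collection c e where
  empty  :: c e
  insert :: e -> c e -> c e
  member :: Eq e => e -> c e -> Bool

-- Ordinary lists, viewed as a collection of any element type.
instance Collection [] a where
  empty  = []
  insert = (:)
  member = elem
```

This is roughly the extension that later became GHC's `MultiParamTypeClasses`; Haskell 98 itself kept the single-parameter restriction.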
Re: Standard Haskell
At 17:26 97/08/20, John Whitley wrote:

> Perhaps what is needed are two tracks of language development, "Standard Haskell" and "Research Haskell". The research community continues to develop, distribute, and test new language concepts with less fear of disrupting existing users. After sufficient time the lessons from Research Haskell can be folded back into Standard Haskell.

This is a model that comes to my mind. It would answer the question posed by John Whitley:

> -- What is the Ultimate Purpose of Haskell? What are the community's long-term goals, the collected hopes and dreams, for the language? In The Beginning, that seemed to be as a focus for research into functional programming languages. Haskell was born and has since matured greatly. Now the decision must be made as to its desired role in human computing endeavors.

There would be two objectives: one to continue the research development, and another to standardize the features which are considered stable and reliable.

The question is how to make the two Haskell variations communicate; clearly the ideal would be to have one compiler that understands both. Perhaps one could agree that code that is not specially marked is considered to be "Standard Haskell", and that "Research Haskell" code is marked as such in its name space.

Otherwise, the idea of marking code with version numbers is not new; it is used in the LaTeX project (which sports a rather large number of non-expert users). One would mark the code in the style "requires Haskell 1.4", with name, version, and optional version date (but one is not required to use this feature). It is like importing a module with a version number attached to it.

Hans Aberg
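[Editor's note: the LaTeX-style version marking described above did eventually appear in Haskell tooling -- modern GHC accepts a named language edition as a pragma, and Cabal records a per-package `default-language` field. A minimal module using the pragma, in syntax from well after this discussion:]

```haskell
{-# LANGUAGE Haskell2010 #-}
-- Pin this module to a named language edition, in the spirit of
-- LaTeX's \NeedsTeXFormat: the compiler interprets the code under
-- the Haskell 2010 rules regardless of its own default edition.
module Main where

main :: IO ()
main = putStrLn "interpreted as Haskell 2010"
```

The pragma does not translate old code to new, as Aberg suggests below, but it does let one compiler accept modules written against different editions.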
Re: Standard Haskell
John Hughes writes:

> > If now the language should be standardized, why not make it an ISO/ANSI standard?
>
> I don't think this is the time. Look at Pascal. After the revised definition was published, many years passed before it became an ISO standard, during which the language did not change one jot. The final standardization just made one or two small revisions; the language itself was already largely in its final form. Let's jump through these hoops in five years' time, when Haskell has been fixed for so long that language researchers aren't interested in it any more...

I do not think that the Pascal standardizing model is being used anymore; instead one schedules a new revision, say every five years (this is used for C++). There is already an application put in for ISO/ANSI standardizing of Java, and I think Java is younger than Haskell. So I think the question should at least be investigated; perhaps it is the developed Standard Haskell that should be made ISO/ANSI.

> Students believe functional languages are good for toy programs because they learned them when they could only write toy programs.

I would rather think that the reason that functional languages are not used is the lack of an ISO/ANSI standard, plus the lack of standard ways of cooperating with other, imperative languages.

Hans Aberg
Re: Standard Haskell
> > > Standardizing a language tends to make it obsolete, due to lack of creativity. Perhaps it is time to start discussing the successor of Haskell then.
> >
> > Please not yet! Let us finish Haskell first!
>
> Well, what I tried to say is that once one starts to standardize Haskell, then, in effect, one is finishing it; this is not really something negative, but has to do with the natural life cycle of computer languages.

All this talk about laying Haskell to rest is starting to put tears in my eyes...

I remember Haskell. If he seemed ascetic and overly functional at times, once you got close to him you could see he was not strict at all. Some say he was lazy, but I say that those men did not see his purity.

Let us observe a moment of silence to mourn the passing of a good friend, of whom no man could say wrong, one who opened our eyes and taught us many things, and now seeks a new home in that great big heap of bits in the sky...

sniff

-- FC

P.S.: All in jest, of course... hopefully standardization would extend Haskell's life rather than quicken (prolong?) its death, as some people are suggesting. SML seems to have benefited from the process.
Re: Standard Haskell
Fergus Henderson writes:

> ISO is the same. But standards don't get updated every five years. Rather, each standard must be _reconsidered_ every five years. One of the possible results is for the standard to be reapproved unchanged. If the standards committee does decide that the standard should be changed, then it will start a new project to produce a revised version of the standard. This process itself takes years. So typically language standards get updated less than once every ten years.
>
> Fortran: 66, 77, 90. COBOL: 74, 85. Ada: 83, 95. C: 89, 9X. (Original standard in '89, currently undergoing revision; revised standard, tentatively titled "C9X", due in 99, but might not happen until 2000 or later.)

True. Others have a greater velocity of change, particularly if they are newer; VHDL, for example.

> However, standards committees can publish normative amendments in the intervening periods. For example, there have been some normative amendments to the C standard since 89 (relating to internationalization and numerical extensions).

There are actually several options here. A "normative amendment" is essentially (in IEEE land) the same as a reballot; it just doesn't require the document to be reprinted. The VHDL committee produced a "sense of the working group" report that, while not officially normative, gave the resolution to several ambiguities and the like.

> This is not _necessarily_ true. For example, the ISO Ada 95 standard is freely available on the net.

It all depends on who gets the money. In this case, the AJPO *paid* for the free availability.

> However, convincing ISO of this would be a significant hurdle to overcome.

Agreed; perhaps impossible.

> In any case, I agree with Dave Barton that ISO standardization for Haskell should not be considered until after the current effort at defining "Standard Haskell" is complete.

Even if then.

Dave Barton * [EMAIL PROTECTED] )0( http://www.intermetrics.com/~dlb
Re: Standard Haskell
Hans Aberg writes:

> I do not think that the Pascal standardizing model is being used anymore; instead one schedules a new revision, say every five years (this is used for C++). There is already an application put in for ISO/ANSI standardizing of Java, and I think Java is younger than Haskell. So I think the question should at least be investigated; perhaps it is the developed Standard Haskell that should be made ISO/ANSI.

Having been through the standardization wars many times, perhaps I should interject here. Virtually all of my experience has been within the IEEE process, although IEEE standards are often upgraded to ANSI and ISO standardization fairly quickly, with only an "up/down" vote (it is *not* automatic, however; Verilog was rejected). The IEEE *requires* restandardization every five years. If another ballot is not taken, then the standard is dropped.

Standardization does not particularly guarantee stability. It does guarantee three things:

1) A certain process has been followed. ANSI and ISO have rules about the process to be followed by organizations that submit standards to them for approval, such as the IEEE and the EIA. This includes openness (anyone wishing to participate in the process and the ballot may) and a certain level of approval. It also assures *lots* of bureaucracy. And I mean lots. More than that. No, even more than that. Lots more.

2) The final result is independent and non-proprietary. This can be more or less true; Verilog is, again, a good counterexample. This is not a worry with Haskell; I don't know *anyone* who thinks that Haskell is the result of a single university or company.

3) It also means (cynically) that the standardization organization makes money off of the standard's publication. If we were to standardize Haskell, the copyright of the document would have to be transferred to the standardization organization. This means that we could no longer distribute the Haskell Report free on the net, and with every download. Think about this. Really. No more free downloads of the Report.

(The Library is a sticky issue, which we are fighting within the IEEE now with respect to the standard VHDL packages for some of the related standards. If anyone is interested in the result, I'll start posting what I hear on this list.)

> I would rather think that the reason that functional languages are not used is the lack of an ISO/ANSI standard, plus the lack of standard ways of cooperating with other, imperative languages.

I must disagree here. After having been in the standardization business for a while, I don't think that standardization means that much to widespread usage. WAVES is a good counterexample in the field of digital CAD. It does have *some* positive effect, but this really is limited. There are *lots* of standards that are nothing more than pieces of paper on the IEEE's bookshelves.

I don't want to sound too pessimistic here. I wouldn't spend so much of my professional time in standardization efforts if I didn't think it was worthwhile. There are some things that standardization brings. In particular, don't look for widespread use of Haskell on government software without an officially recognized standard in place. It also does give commercial vendors some feeling that there may be a market there, or at least some interest.

And, frankly, it would make my life a whole lot easier; I am creating a standard that is based in Haskell and defined by a series of Haskell files. At this point I am planning on simply referring to the Haskell report as prior, and existing, art; however, I am expecting to take some flak on that. It would be vastly easier for me if I could point to a standard instead.

But standardization is a two-edged sword. It does take up a *lot* of time. The common mode of standard creation these days is to go to a standardization body with something close to ready, and hope for few changes and a relatively painless ballot. Haskell is certainly ready for this kind of effort. But expect some change, and a two to two-and-a-half year process before the ballot comes in.

I would be happy to answer further questions about the IEEE standardization process if it is relevant, and what I know about other standardization processes. But I don't want to get too bogged down in it. Certainly we should not approach a standardization organization until the present effort for creating Standard Haskell is complete. Therefore, it should not take up much of our time.

One more note: frankly, you (the members of this list) don't entirely control the question. Given that the Haskell report is freely available, *anyone* can submit it to one of the standardization organizations to start the process if they wish. Of course, this need not affect what the compiler builders here do (and probably wouldn't). Nevertheless, the question need not necessarily concern us here; if someone feels strongly enough to
Re: Standard Haskell
Hans Aberg writes:

> I would rather think that the reason that functional languages are not used is the lack of an ISO/ANSI standard, plus the lack of standard ways of cooperating with other, imperative languages.

This is true. The Haskell community has to decide whether Haskell should be a quickly changing research language or a stable but slowly evolving tool for developing real-world systems. It's a common problem in our business: users want stable and upward-compatible systems; designers want to integrate new concepts and drop the whole thing if something better comes along ('when there is a Haskell standard, it's time to develop a new language').

Does Haskell really need the features that will be part of a Research Haskell? Or is it better to freeze Haskell development now and start developing systems u s i n g Haskell? Languages look very ugly if too overloaded with new concepts (look at C++). If Haskell still lacks important features, it is no use to make a Standard Haskell now.

Wolfgang Beck
Re: Standard Haskell
At 14:34 97/08/20, John Hughes wrote:

> Standard Haskell
>
> ...there was a lively discussion at the Haskell Workshop in Amsterdam this year about the future of the language. To summarise, despite the useful extensions in versions 1.3 and 1.4, many people feel quite serious concern about the recent development of the language. Here are some questions:
>
> * The definition has been changing too often, making it hard for students, teachers, and other users to keep up. Anyone making a big investment in Haskell needs to know the language will be stable.

If now the language should be standardized, why not make it an ISO/ANSI standard?

> In response, it was proposed that Haskell be fixed --- permanently. The fixed language will be called `Standard Haskell', and so there will be no Haskell 1.5, 1.6, etc.

Is it not possible to make the versions upwards compatible, so that Haskell 1.4 code somehow can be run on Haskell 1.5? Does "being stable" need to mean unchangeable?

> * The language has become more complex, making it difficult for beginners to master, and thereby less suitable for teaching.

It seems to me that this increased complexity is the result of people starting to find the language useful. An idea to handle this might be to opt for a more condensed kernel, based on logically clean principles (a good theory); around this, more special structures are being developed. The trademark of the discussions in this group has been the absence of attempts to find such a structured design (meaning that one would rather discuss how to add some nice new features).

Standardizing a language tends to make it obsolete, due to lack of creativity. Perhaps it is time to start discussing the successor of Haskell then.

Hans Aberg * AMS member: Listing http://www.ams.org/cml/ * Email: Hans Aberg [EMAIL PROTECTED]
Re: Standard Haskell
At 17:25 97/08/20, [EMAIL PROTECTED] wrote:

> On 20 Aug, Hans Aberg wrote:
> > Is it not possible to make the versions upwards compatible, so that Haskell 1.4 code somehow can be run on Haskell 1.5? Does "being stable" need to mean unchangeable?
>
> Well, one way would be to require a directive at the head of every file saying (for example)
>
>   Haskell 1.4
>
> And then compilers could always say "Not interested in compiling Haskell 1.4 programmes"...

For this idea to make sense, it needs some automation for the user: either by making sure Haskell 1.5 recognizes the code and knows how to interpret it (or comes with suggestions on how to modify the code), or by providing a 1.4-to-1.5 code translator. (Perhaps via an upgrade monad. :-) )

After all, a lot of work has been spent making personal computers upwards compatible, so why not computer languages?

Hans Aberg * AMS member: Listing http://www.ams.org/cml/ * Email: Hans Aberg [EMAIL PROTECTED]
Re: Standard Haskell
Hans Aberg writes:

> After all, a lot of work has been spent making personal computers upwards compatible, so why not computer languages?

The (perhaps obvious) reason to make anything backwards compatible is to support a legacy user base. Clearly, there is a tension between freezing the language to support growth of the user base and the desire of the research community to continue development of the language.

Haskell began its existence as a unified research language for a large segment of the functional programming language community. Now Haskell has grown sufficiently, in both design and implementation, to gain a user base large enough to have stability concerns. I fully support the growth of this user base, as it will serve a number of important purposes. Some benefits of an active user base are: to reveal good idioms of Haskell programming style, to drive creation and refinement of Haskell development environments, and to provide examples and motivation for Haskell compiler refinements.

But surely as the above benefits accrue, further areas for refinement of the language will be revealed, and wholly new research areas will emerge and mature. The research community centered around Haskell has done much to create a language in which powerful abstraction tools can be brought to bear on real-world problems. This research environment should continue, and requires a focused research forum for the same reasons Haskell was created in the first place.

Perhaps what is needed are two tracks of language development, "Standard Haskell" and "Research Haskell". The research community continues to develop, distribute, and test new language concepts with less fear of disrupting existing users. After sufficient time the lessons from Research Haskell can be folded back into Standard Haskell.

In fact, I remember hearing that intent around the time of Haskell 1.3 -- that the next language release wouldn't be until Haskell 2.0. That would permit interim stability and sufficient time for certain "big picture" problems (especially the module system) to be addressed. What happened here? Perhaps what is really required is just an enforced moratorium on new language releases for a specified time. A complete split between Standard and Research Haskell groups may present a division-of-labor problem.

In any case, it is time to again ask several basic questions:

-- What is the Ultimate Purpose of Haskell? What are the community's long-term goals, the collected hopes and dreams, for the language? In The Beginning, that seemed to be as a focus for research into functional programming languages. Haskell was born and has since matured greatly. Now the decision must be made as to its desired role in human computing endeavors.

-- In what ways must Haskell be simplified/changed/expanded in order to achieve the desired Ultimate Purpose? As an example, I would wish to see the power of Haskell brought to bear on systems programming. Moreover, I want powerful abstraction tools made available to the operating system designer and implementor. OS entities are usually untyped (i.e. just bits), or have an ad-hoc typing notion at best. Thus the issues that interest me most strongly are persistence, support for programming in the large, and dynamic typing.

I would suggest some agreement be made on the appropriate big-picture issues. Then a pact should be sealed between the Haskell research community and the user community that prevents a language release until good answers are found to the big-picture issues.

-- John Whitley
Re: Standard Haskell
On 20 Aug, Hans Aberg wrote:

> Is it not possible to make the versions upwards compatible, so that Haskell 1.4 code somehow can be run on Haskell 1.5? Does "being stable" need to mean unchangeable?

Well, one way would be to require a directive at the head of every file saying (for example)

  Haskell 1.4

And then compilers could always say "Not interested in compiling Haskell 1.4 programmes"...

I don't think I like this, though: it's an extra feature, and the whole point of the standardisation effort is to replace features by orthogonality (I hope).

Jon

-- Jon Fairbairn [EMAIL PROTECTED]
18 Kimberley Road [EMAIL PROTECTED]
Cambridge CB4 1HH +44 1223 570179 (pm only, please)