Re: Re[2]: [Haskell-cafe] funct.prog. vs logic prog., practical Haskell

2009-08-03 Thread Bill Wood
On Mon, 2009-08-03 at 00:24 +0400, Bulat Ziganshin wrote:
   . . .
 and the primary way to make a haskell program faster is to emulate an
 imperative language, and the best way to optimize a C program is to use
 it as a cpu-independent assembler.
 
 it's all natural in the von Neumann world and i personally don't buy these as
 arguments against everything developed since 1956
 
 actually it supports their POV: LP is FP plus non-determinism, so
 why buy a part instead of the whole thing? if they need to teach students
 how to optimize programs, Haskell will be out of luck anyway

This seems largely tangential to my point, which was that logic
programmers can well benefit from exposure to functional programming,
since 1) one important approach to improving the performance of logic
programs is reduction of non-determinism, and 2) often a deterministic
logic program is very similar to a functional program.

-- 
Bill Wood



Re: [Haskell-cafe] funct.prog. vs logic prog., practical Haskell

2009-08-02 Thread Bill Wood
On Sun, 2009-08-02 at 12:25 +0200, Petr Pudlak wrote:

 (2) is harder for me, since I've never programmed in Prolog or another
 language for logic programming. I'd be happy if anyone who is experienced
 in both Prolog and Haskell could elaborate the differences, pros & cons,
 etc.

I have done some real-world programming in Prolog and SML.  The
conventional wisdom in the LP community seems to be that the primary
path to performance improvement of logic programs is by reduction of
non-determinism.  To a first approximation, when you have eliminated all
the non-determinism from a logic program what you have left is a
functional program.  The work I was involved with, trying to get
quasi-real-time performance from Prolog, bore this out.
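
A toy Haskell sketch of that limit -- my illustration, not code from
the project.  Prolog's append/3 run backwards is a non-deterministic
relation; the list monad plays that role in Haskell, and once you
commit to a single answer an ordinary function is what remains:

    -- all ways to split a list in two: append/3 "run backwards"
    splits :: [a] -> [([a], [a])]
    splits []       = [([], [])]
    splits (x : xs) = ([], x : xs) : [ (x : as, bs) | (as, bs) <- splits xs ]

    -- with the non-determinism gone, append/3 "run forwards" is just (++)
    appendL :: [a] -> [a] -> [a]
    appendL = (++)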

-- 
Bill Wood



Re: [Haskell-cafe] Doing some things right

2007-12-28 Thread Bill Wood
On Fri, 2007-12-28 at 20:23 -0700, Luke Palmer wrote:
   . . .
 OO is orthogonal to functional.  Erlang is pure functional, Lisp is a
 bastard child...

Give it its historical due, please -- bastard grandsire at least.

 -- Bill Wood




Re: [Haskell-cafe] DSL question -- was: New slogan for haskell.org

2007-12-19 Thread Bill Wood
On Wed, 2007-12-19 at 13:03 -0500, Steve Lihn wrote:
   . . .
 In the Disadvantage section (near the end), there is an item -- "hard
 or impossible to debug". Can anybody explain why or what it means? And
 how does it apply to Haskell?

Lisp has long been touted as a good language for developing a DSL, one
reason being its macro system.  However, it has also long been realized
that DSLs implemented with macros can be extremely difficult to debug,
because by the time bugs manifest, the (macro) source is long gone and
its connection to the expansion text is difficult to determine.

 -- Bill Wood




Re: [Haskell-cafe] New slogan for haskell.org

2007-12-13 Thread Bill Wood
On Thu, 2007-12-13 at 09:55 -0500, Denis Bueno wrote:
   . . .
 More importantly for this discussion, however: Fibonacci heaps have
 nothing to do with calculating the fibonacci numbers, and you don't
 even need to know what the fibonacci sequence is to use fibonacci
 heaps.  (You discover what it is, if you didn't know, when you do a
 complexity analysis of fibonacci heaps, but, that's only useful for
 proving how efficient the heaps can be.)  Therefore, I don't think one
 can successfully argue that fibonacci numbers are important because a
 heap has Fibonacci in its name.
 
 Just a nit, but I thought it worth mentioning. =]

That's fair, and Cormen et al. said pretty much the same thing in Chap.
20.  I think the argument is that the Fibonacci sequence is important to
*understanding* the Fibonacci heap.  Still, your point is well taken.

 -- Bill Wood




Re: [Haskell-cafe] New slogan for haskell.org

2007-12-12 Thread Bill Wood
On Wed, 2007-12-12 at 11:19 +, Andrew Coppin wrote:
   . . .
 ...and normal programmers care about the Fibonacci numbers because...?
 
 Seriously, there are many, many programmers who don't even know what 
 Fibonacci numbers *are*. And even I can't think of a useful purpose for 
 them. (Unless you count Fibonacci codes?)

Knuth [1], pp. 417-419, discusses Fibonacci trees and Fibonacci search.
According to Knuth (and who am I to argue with him), Fibonacci search has
a better average-case running time than binary search, although the
worst case can be slightly slower.
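
For reference, the sequence itself is the classic lazy one-liner in
Haskell:

    fibs :: [Integer]
    fibs = 0 : 1 : zipWith (+) fibs (tail fibs)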

Cormen et al. [2] devote chapter 20 to Fibonacci heaps, which they say
are primarily of theoretical interest.

[1] Donald E. Knuth, The Art of Computer Programming, vol. 3, second
edition, Addison Wesley Longman (1998).

[2] Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest and
Clifford Stein, Introduction to Algorithms, second edition, The MIT
Press (2001).

 -- Bill Wood



Re: [Haskell-cafe] More on Fibonacci numbers

2007-11-08 Thread Bill Wood
On Fri, 2007-11-09 at 02:38 +0100, Daniel Fischer wrote:
 On Friday, 9 November 2007 at 02:25, [EMAIL PROTECTED] wrote:
 
  Since I have no idea what a real mail client is, you will not frighten
  me! My mail client is apparently complex.
 
 As long as it's not purely imaginary...

This is becoming surreal.

 -- Bill Wood



Re: [Haskell-cafe] Learn Prolog...

2007-09-03 Thread Bill Wood
On Mon, 2007-09-03 at 07:46 +0100, Andrew Cheadle wrote:
   . . .
 Incidentally, we've often seen a lot of traffic on here about Sudoku 
 solvers and I've always wanted to post the ECLiPSe solution
 (neat when you consider the length of the sudoku/2 predicate ;-) :

Reasonably quick, too -- 50 sudokus in ca. 0.1 sec. (I did a Project
Euler problem that required solving 50 sudokus with ECLiPSe).

 -- Bill Wood




Re: [Haskell-cafe] Learn Prolog...

2007-09-03 Thread Bill Wood
On Mon, 2007-09-03 at 02:49 -0400, [EMAIL PROTECTED] wrote:
   . . .
 I'm sure this isn't the case for you, but a typical Prolog programmer's
 idea of "large" is very different from a typical COBOL programmer's.

Ever the diplomat? :-)  Actually, that is a fair observation.  I don't
think I ever heard a figure, but I would guess we were in the 20-50 KLOC
range.  The Prolog portion was solving an optimization problem.  The
problem was exacerbated by the fact that the code base was cloned onto
around 20 different machines that were being fed by a 1-to-20 split of a
stream of requests from the internet.  At the back end was FFI access to
a terabyte data store.  Oh by the way, the operational goal was
30-second turn-around from the client end -- submit a request from a
local office and have results 30 seconds later.

   . . .
 Did you look at Mercury?

I looked seriously at Mercury.  It was rejected for two
managerial/political reasons.  The first was that it did not appear that
Mercury could support the scale of the application, at least partially
because it appeared at the time that development was not very active.
The second reason was that the development group had only made the
transition from assembly language to Prolog within the past year or so,
and the prospect of pulling the group through another paradigm shift
made all the managers turn pale.

I sent some email to the Mercury site asking their opinion as to whether
it was up to the challenge; the response was a candid "probably not".

I still sometimes think it might have worked, but the risks would have
been horrendous.

 -- Bill Wood



Re: [Haskell-cafe] Learn Prolog...

2007-09-02 Thread Bill Wood
As to whether Prolog is dead or not, it depends on your definition of
"dead".  Three years ago (not ten!) I made my living maintaining and
developing a large application written in Prolog.  That was actually an
interesting experience, since one of the performance drivers was speed.
As a result, code was being perpetually tuned toward less
non-determinism.  You know what the limit is?  Functional programming!
At the time I did a little research looking for an FP language that was
ready for prime time and for which the pain of the move for a large
organization would have been acceptably low.  Sadly, nothing came of it.
I still think the application could have been profitably ported to a
functional language.

Recently I have been experimenting with ECLiPSe, a Constraint Logic
Programming system embedded within standard Prolog.  I found several of
the problems in Project Euler were perfect candidates for attack by
constraint programming.  Yes, I could have written solutions that
implemented some sort of search strategy, but there is a legitimate
question of leverage.  For example, Sudoku puzzles are very naturally
viewed as constraint programming problems.  But why should I write my
own solution when the Sudoku solver provided as a demo could solve 50
problems in 0.10 seconds!  By the way, I could get a fairly good idea of
the how and why of the problem and its solution from the form of the
demo code; and oh yes, I had already written my own Sudoku solver in
OCaml a year or so ago.

To Jerzy's point -- I strongly believe that learning a language like
Prolog is a good idea for two reasons -- first, it adds another tool to
the programmer's toolkit, and second, it enlarges the programmer's view
of ways to think about solving problems.

 -- Bill Wood




Re: [Haskell-cafe] Learn Prolog...

2007-09-02 Thread Bill Wood
On Mon, 2007-09-03 at 07:43 +0800, Hugh Perkins wrote:
 On 9/3/07, Derek Elkins [EMAIL PROTECTED] wrote:
  Because no one has said it quite this way:
  The modern equivalent of Prolog is Prolog.

I was just about to say the same thing :-); thanks, Derek.

   . . .
 (btw, just thought, when I was talking about FFI, I probably meant
 Forth, not Prolog.  FFI for Prolog probably isn't that important.)

No, Foreign Function Interfaces are as useful with Prolog as with any
other high-level language.

(BTW I thought the FFI for Forth was the Forth assembler; have things
changed since FIG/F83?)

I just did a fast scan and found that XSB and SWI Prolog seem to be
still quite active.  If you have a few bucks (or euros), SICStus is also
available.

I was quite satisfied with XSB, though my experience is somewhat dated
now.  It is somewhat idiosyncratic (they're talking about getting closer
to ISO Prolog with their latest release).  I have also had good results
with SWI.  Both of them support some CLP libraries.

GNU Prolog is also out there, but I don't know how active development is
(please, I said I don't know, not that I thought it was becoming
moribund).  I've used it a little.  It also comes with something of a
CLP library.

It looks like you can get an individual license for SICStus for ca. 155
euros.  I used it a lot about three years ago and it seemed to be quite
stable, had good performance, and we received good support.  Of course,
we were a big corporate customer.

Prolog seems to be quite alive and kicking.





Re: [Haskell-cafe] Why monad tutorials don't work

2007-08-14 Thread Bill Wood
On Tue, 2007-08-14 at 16:02 -0700, Dan Piponi wrote:
   . . .
 On 8/14/07, Michael Vanier [EMAIL PROTECTED] wrote:
   I'm reminded of
  a physics teacher who was having a similar problem explaining the concept 
  of tensors, until he said that "a tensor is something that transforms
  like a tensor does!"
 
 Grrr...must...hold...my...tongue...

Dan, as a former student of a clone of that physics teacher, I am really
interested in what you will say when you fail to hold your tongue.

 -- Bill Wood




Re: [Haskell-cafe] A very edgy language

2007-07-06 Thread Bill Wood
On Sat, 2007-07-07 at 15:08 +1000, Donald Bruce Stewart wrote:
   . . .
  I don't know. #math is larger than #accounting. Is it because math is 
  more mainstream than accounting? I bet it is because math is more 
 
 math is more *interesting* than accounting? :-)

If we gotta have a theory, I'll go with this one!

 -- Bill Wood




Re: [Haskell-cafe] FOL

2007-06-05 Thread Bill Wood
On Tue, 2007-06-05 at 14:41 +1000, Tony Morris wrote:
 I would like to know if all 16 possible functions that accept two
 boolean arguments have names in First-Order Logic. I know they have
 Haskell function names (e.g. \p -> \_ -> id p, \_ -> \_ -> const True),
 but I'm hoping there is a more general name that is recognised by anyone
 with a feel for logic, but not necessarily Haskell.
 
 I have listed all sixteen of these functions below using Haskell (named
 a to p) along with the name of those that I am aware of having a name as
 a comment.
 
 Thanks for any tips.
 
 {-
 
 p q | a b c d e f g h i j k l m n o p
 0 0 | 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1
 0 1 | 0 0 0 0 1 1 1 1 0 0 0 0 1 1 1 1
 1 0 | 0 0 1 1 0 0 1 1 0 0 1 1 0 0 1 1
 1 1 | 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1
 
 -}

The best I've been able to come up with is names and connective phrases
for pronunciation for twelve.  I suspect that not enough people have
found a use for the others to warrant specific terms.

In the table below I've listed the twelve with name, phrase and
corresponding column of your table.  I've expressed the phrases in the
form "P phrase Q" to indicate (rough) pronunciation.

  Name                          Connective            Table Column

  Conjunction                   P and Q                    b
  Inclusive Disjunction         P or Q                     h
  Exclusive Disjunction         P exclusive or Q           g
  Conditional                   P only if Q                n
  Biconditional                 P if and only if Q         j
  Sheffer Stroke                P stroke (nand) Q          o
  Sheffer Slash                 P slash (nor) Q            i
  Inverse Conditional           P if Q                     l
  Tautology                     True                       p
  Inconsistency                 False                      a
  Negative Conditional          P but not Q                c
  Negative Inverse Conditional  Q but not P                e

As you can see from the table, Tautology and Inconsistency are rarely if
ever used as connectives.

I checked these in Carol Horn Greenstein, _Dictionary of Logical Terms
and Symbols_, Van Nostrand Reinhold Company, 1978.
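
A small Haskell sketch for checking the table -- my addition, not part
of the original exchange.  Rows are (p,q) = (0,0),(0,1),(1,0),(1,1), as
above:

    truthTable :: (Bool -> Bool -> Bool) -> [Bool]
    truthTable f = [ f p q | p <- [False, True], q <- [False, True] ]

    -- the sixteen columns a..p, generated in the order used above
    columns :: [[Bool]]
    columns = [ [r00, r01, r10, r11]
              | r00 <- bs, r01 <- bs, r10 <- bs, r11 <- bs ]
      where bs = [False, True]

    -- e.g. truthTable (&&)                 == columns !! 1   (column b)
    --      truthTable (\p q -> not p || q) == columns !! 13  (column n)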

 -- Bill Wood



Re: [Haskell-cafe] ANN: Atom - Yet another Haskell HDL

2007-04-04 Thread Bill Wood
On Tue, 2007-04-03 at 23:18 -0500, Tom Hawkins wrote:
 Hi,
 
 Haskell has a rich history of embedded hardware description languages.
  Here's one more for the list.
 
 Inspired by the work of Arvind, Hoe, and all the sharp folks at
 Bluespec, Atom is a small HDL that compiles conditional term rewriting
 systems down to Verilog RTL.  In Atom, a circuit description is
 composed of a set of state elements (registers) and a set of rules.
 Each rule has two components: an enabling condition and a collection
 of actions, or state updates.  When a rule is enabled, its actions
 may be selected to execute atomically.  In contrast to Verilog
 "always" blocks, multiple rules can write to the same state element.

Just curious: how does this relate to Gurevich's Evolving Algebras
(renamed Abstract State Machines)?

 -- Bill Wood




Re: [Haskell-cafe] Re: Channel9 Interview: Software Composability and the Future of Languages

2007-01-31 Thread Bill Wood
On Wed, 2007-01-31 at 08:45 +, Magnus Therning wrote:
   . . .
 Neither way may be natural, but imperative thinking is extremely
 common in society, I'd say much more so than functional thinking.  Just
 think of cooking recipes, IKEA instructions, all the algorithms taught
 in math classes in grade school.  I doubt that we'll ever see functional
 thinking taught alongside imperative thinking in lower grades in school.
 
 It could even be that the development of the human brain as we
 grow up reaches a state where it can handle imperative thinking before
 it can handle functional thinking (I bet there's a reason why astronauts
 have step-wise instructions rather than a set of functions).
 
 All I'm trying to say is that imperative thinking is so common outside
 of CS/math and we learn it so early on in life that we probably can
 consider it the natural thinking way.

I think you're quite right here.  There's a reason that so much of
Artificial Intelligence research dealt with Planning, Procedural
Knowledge, and Temporal Reasoning.  The so-called Frame Problem would
not have been such a research topic if it weren't for the importance
that many researchers placed on reasoning about state change.

On the other hand work on declarative knowledge focused on taxonomic
hierarchies and semantic nets, which dealt mostly with structural
knowledge.  A common view was that the two sorts of knowledge
complemented each other, and corresponded in a natural way to intuitions
(and some epistemological theory) about human reasoning.

 -- Bill Wood



Re: [Haskell-cafe] Re: Channel9 Interview: Software Composability and the Future of Languages

2007-01-31 Thread Bill Wood
On Wed, 2007-01-31 at 19:51 +1100, Donald Bruce Stewart wrote:
   . . .
 foldl (\water dish -> wash water dish) soapywater dishes :: [Dishes]

Nice example.  First, note that you can't get close with map -- you need
accumulation across the dishes.  Second, the correctness of this plan
depends on the rather strong frame axiom that no entity in the
environment is changed during a step of the fold, so no broken dishes.
Finally, that doesn't work so well when there are constraints on the
order in which the dishes are washed, for example washing the cruddiest
dishes first while there are more suds.
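
To make that last point concrete, here is a minimal sketch (made-up
Water/Dish types, not from the thread) where the ordering constraint is
handled by sorting before the fold:

    import Data.List (sortOn)
    import Data.Ord (Down (..))

    newtype Water = Water Int          -- remaining suds (hypothetical)
    data Dish = Dish { crud :: Int }   -- how dirty (hypothetical)

    wash :: Water -> Dish -> Water
    wash (Water suds) d = Water (max 0 (suds - crud d))

    -- cruddiest dishes first, while there are still suds
    washCruddiestFirst :: Water -> [Dish] -> Water
    washCruddiestFirst water = foldl wash water . sortOn (Down . crud)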

 -- Bill Wood




Re: [Haskell-cafe] trivial function application question

2007-01-05 Thread Bill Wood
It would seem that for a regular expression facility to constitute a
parser it would have to be able to work on token streams.  So my question
is: does either the Perl 6 generalization or any of the Haskell regex
facilities support regular expressions over any free monoid other than
finite character sequences?
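
Nothing about the idea is tied to characters, at least; a naive
backtracking matcher over an arbitrary alphabet -- a toy sketch of my
own, not any of the existing libraries -- works over any [a]:

    -- regular expressions over an arbitrary alphabet a
    data Re a = Eps | Sym a | Alt (Re a) (Re a) | Seq (Re a) (Re a) | Star (Re a)

    -- return every suffix left over after a match; accept if any is empty
    accepts :: Eq a => Re a -> [a] -> Bool
    accepts r xs = any null (go r xs)
      where
        go Eps       s                = [s]
        go (Sym c)   (x : s) | x == c = [s]
        go (Sym _)   _                = []
        go (Alt a b) s                = go a s ++ go b s
        go (Seq a b) s                = concatMap (go b) (go a s)
        go (Star a)  s                = s : concatMap (go (Star a))
                                              [ s' | s' <- go a s, s' /= s ]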

 -- Bill Wood





Re: [Haskell-cafe] Is Haskell a 5GL?

2006-09-25 Thread Bill Wood
On Mon, 2006-09-25 at 16:50 +0200, Jerzy Karczmarczuk wrote:
   . . .
 (For the CDC mainframes the assembly language was called Compass
 [Comprehensive Assembler]. It was a 7648764GL. You could program
 in it almost like in Lisp thanks to some very exquisite macros,
 and add another set of macros which performed some type checking
 and optimized the register allocation. People who used it got
 famous (as M. Veltman, Nobel prize in physics), or got mad.)

And there was something exquisitely perverse about writing Compass
programs as Fortran subroutines so you could avoid writing IO in
Peripheral Processor code.

 -- Bill Wood




Re: [Haskell-cafe] Re: Is Haskell a 5GL?

2006-09-25 Thread Bill Wood
On Mon, 2006-09-25 at 22:22 +0200, Christoph Herrmann wrote:
   . . .
 What Prolog really provides concerning automatic problem solving
 is little: equation solving in term algebra; you can simulate that
 in Haskell without much effort. On the other hand, I saw Haskell
 classified as a 3GL. The problem is that Haskell often exposes
 the algorithmic structure. What people often forget is that Prolog
 programs in non-trivial situations are likely to fail if you do
 not prescribe the evaluation order, by features like the cut which
 destroy the declarative semantics. People also forget that arithmetic
 constraints in Prolog have to be directed (input/output variables),
 no difference to Haskell.

I spent some time working on a large Prolog application where
performance was critical, and it became obvious that when a Prolog
program is tuned by removing non-determinism, it moves towards a
functional program.  Any real (non-textbook example) Prolog program has
to expose algorithmic details simply because the programmer must a) make
decisions and b) express them.  I think you're right that Haskell should
be in the same bag as Prolog.

 -- Bill Wood




Re: [Haskell-cafe] Re: Haskell web forum

2006-09-21 Thread Bill Wood
I have only recently started accessing some web fora, but I've noticed
that some of those powered by phpBB are vulnerable to spamming,
whereas the news groups seem to be less so.  For example, the
python-forum has nearly lost its General forum to spammers.  Maybe
the experts know better engines, better ways to set up a forum, or
better ways to administer them after they're up, but it is a concern.

 -- Bill Wood





Re: [Haskell-cafe] Re: sections of noncommutative operators

2006-09-09 Thread Bill Wood
On Sat, 2006-09-09 at 11:17 +0100, Jón Fairbairn wrote:
   . . .
 I should think so. But does lisp have currying these days? 
 (lessp 0 1) == T
 but (lessp 0) would be an error, wouldn't it?

For Scheme, R5RS Section 6.2.5 specifies that < and > take two or
more arguments, and PLT Scheme raises an exception on (< 0).

For Common Lisp, CLtL2 specifies for =, /=, <, >, <=, >= that "These
functions take one or more arguments" (p. 293).  (< x1 ...) is true
when the arguments form a monotonically increasing sequence, and clearly
a singleton sequence is monotonic.  In the CLISP implementation of
Common Lisp (< 0) returns T.  BTW there is no lessp in CL.

As you noted, CL and Scheme do not support currying.  However, SRFI 26
(Scheme Request for Implementation, a quasi-standard mechanism for
introducing extension libraries into Scheme) provides a "Notation for
Specializing Parameters without Currying" as syntactic sugar for
nested lambdas.
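
In Haskell, of course, the section gives the specialization directly;
(0 <) is roughly what SRFI 26 writes as (cut < 0 <>):

    positive :: Integer -> Bool
    positive = (0 <)        -- True exactly when the argument exceeds 0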

 -- Bill Wood



Re: [Haskell-cafe] head as a total function

2006-09-07 Thread Bill Wood
On Thu, 2006-09-07 at 11:03 +0400, Bulat Ziganshin wrote:
   . . .
 it was the first imperative language supporting closures, after all

Uh, what about Lisp?  The (MIT) LISP 1.5 manual (ca. 1965) refers to
FUNCTION differing from QUOTE in that it handled free variables
correctly; I take this to mean that at least a primitive form of
closure was provided.  Moreover, a language that provides SET/SETQ,
RPLACA/RPLACD and the PROG feature (including labels and a goto) surely
qualifies as imperative?

 -- Bill Wood




Re: [Haskell-cafe] Re: ANNOUNCE: HNOP 0.1

2006-07-01 Thread Bill Wood
On Fri, 2006-06-30 at 21:20 -0700, Bjorn Bringert wrote:
   . . .
 I guess this is related to the issue of exception handling. What if  
 the program is interrupted when it's halfway through doing nothing?  
 Ideally we would like to guarantee that it does nothing atomically.

This may be difficult to guarantee.  Consider:

1.  If it does nothing atomically then it does nothing atomically.

2.  But if it does nothing atomically then it is false that it does
nothing atomically.

3. Ergo it is false that it does nothing atomically, from (1) and (2)
   by Reductio ad Absurdum.

 -- Bill Wood




Re: [Haskell-cafe] Re: how to write an haskell binding

2006-06-27 Thread Bill Wood
On Tue, 2006-06-27 at 13:35 -0700, Jeremy Shaw wrote:
   . . .
 How about "heir"? Also, until recently, "herb" and "humble"?

I grew up in the southern US, and I was taught 'herb' with a silent 'h'
but 'humble' with an aspirated 'h'.  With the 'h' silent, 'humble' sounds
very Dickensian to my ear.

-- Bill Wood




Re: [Haskell-cafe] Re: Functional progr., infinity, and the Universe

2006-06-23 Thread Bill Wood
On Fri, 2006-06-23 at 09:38 -0400, Paul Hudak wrote:
   . . .
 But the limit of a chain IS the maximal element of the set of all 
 elements comprising the chain, since the LUB, in the case of a chain, is 
 unique, and thus we don't have to worry about choosing the least 
 element (i.e. it reduces to the upper bound, or maximal element).

There must be something additional going on, since in general the fact
that an ordered subset of a set has a LUB in the set does not imply that
the LUB is in the subset.  For example, the subset {1 - 1/n : n in N+}
of Q has LUB = 1, but 1 is not an element of the subset.  It would seem
that while the infinite list is the LUB of the chain of finite lists, it
is not itself a member of the chain of finite lists.  So, what am I
missing?

 -- Bill Wood




Re: [Haskell-cafe] Functional progr., images, laziness and all the rest

2006-06-22 Thread Bill Wood
On Thu, 2006-06-22 at 15:16 +0100, Brian Hulley wrote:
   . . .
 But how does this change the fact that y still has 1 more element than yq?
 yq is, after all, not a circular list.
 I don't see why induction can't just be applied infinitely to prove this.

The set of all non-negative integers has one more element than the set
of all positive integers, however they have the same cardinality,
aleph-null.  This phenomenon is the hallmark of infinite sets.

 -- Bill Wood




Re: [Haskell-cafe] Functional progr., images, laziness and all the rest

2006-06-22 Thread Bill Wood
On Thu, 2006-06-22 at 20:13 +0100, Brian Hulley wrote:
   . . .
 filter function),  I can reason about my programs without entering the areas 
 of maths that I don't believe in, which is one more reason for my desire to 
 have a totally strict version of Haskell ;-)

This may also explain why those who *do* believe in some of those maths
resist the move to totally strict Haskell :-).

 -- Bill Wood




Re: [Haskell-cafe] |> vs. $ (was: request for code review)

2006-03-06 Thread Bill Wood
On Mon, 2006-03-06 at 11:25 -0800, Shannon -jj Behrens wrote:
   . . .
 I find ctx |> currTok |> tokenType to be more readable than
 tokenType $ currTok $ ctx because you're not reading the code in
 reverse.  That's my primary complaint with . and $.  That's
 especially the case when I'm spreading the code over multiple lines:

(Just my $0.02 worth; no flames please :-)  I notice that all but the
last of your four functions read like commands in English.  It seems
natural to write a sequence of commands left-to-right in the order they
are to be performed, so |> seems natural and $ and . seem backward.
However, if the chain f $ g $ h x is read something like "the result
of applying f to the result of applying g to the result of applying h to
x", then the usual order seems more natural.

And my preferences differ from Udo's -- I would not like to see the
order of args to . and $ reversed *unless* the arg was written to
the left of the chain, a la x h $ g $ f, as is done by some
algebraists.

It does seem that the whole controversy boils down to how the writer
thinks of the chain -- as a sequence of actions or as the evolution of a
value.  Neither one is the One True View; they're just different.
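
For concreteness, the left-to-right application operator in question is
definable in two lines (a sketch; the fixity is my choice, made so that
chains read naturally):

    infixl 1 |>
    (|>) :: a -> (a -> b) -> b
    x |> f = f x

    -- ctx |> currTok |> tokenType  ==  tokenType $ currTok $ ctx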

 -- Bill Wood




Re: [Haskell-cafe] Why is $ right associative instead of left associative?

2006-02-05 Thread Bill Wood
On Sun, 2006-02-05 at 13:49 +0100, Tomasz Zielonka wrote:
   . . .
 and I want to add some transformation between g and z I have to
 change one line and insert another
 
 f x y .
 g x .
 h x y $
 z

 With right-associative $ it would be only one line-add. Probably not a
 very strong argument.

Maybe stronger than you think.  I know that one of the arguments for
making ";" a C-style delimiter rather than a Pascal-style separator is
that adding a new statement at the end of a series is error-prone -- one
tends to forget to add the ";" in front of the new statement (and one
reason Pascal syntax included the null statement was so that "s1;"
would parse as "s1; null", making ";" a de facto delimiter).

Editing ease matters more than a little.

 -- Bill Wood




Re: [Haskell-cafe] Why is $ right associative instead of left associative?

2006-02-04 Thread Bill Wood
On Sat, 2006-02-04 at 23:34 +, Chris Kuklewicz wrote:
   . . .
 But this implies [a,b,c,[]..] is the same as [a,b,c] and [a,b,c,[d,e,f]..] is
 the same as [a,b,c,d,e,f] and [a,b,c,[d,e,f..]..] is [a,b,c,d,e,f..]

Hmmm, does this get us to difference lists a la Prolog?

 -- Bill Wood




Re: [Haskell-cafe] Type Eigenvariable [was: Simple IO Regions]

2006-01-19 Thread Bill Wood
On Thu, 2006-01-19 at 19:18 -0800, [EMAIL PROTECTED] wrote:
   . . . 
 _variables_. The paper argues, btw, for the separation of those senses
 and for a new quantifier to mean uniqueness only. In short, when we
 write
   forall x. forall y. body
 then x may well be equal to y (in body). And so,
   forall x. forall y. B[x,y] => forall z. B[z,z]
 OTOH, when we write |nabla x. nabla y. body| then x and y are
 guaranteed to be different, and so the implication above no longer holds.

OK, I gotta ask: is |nabla x, nabla y. phi(x,y)| logically equivalent to
|forall x, forall y. x <> y only-if phi(x,y)|?  I use |P only-if Q| for
|P materially implies Q|.

 -- Bill Wood




Re: [Haskell-cafe] Re: Haskell Speed

2005-12-29 Thread Bill Wood
On Thu, 2005-12-29 at 15:56 -0500, Albert Lai wrote:
   . . .
 one-pass fast method, since for almost a decade no one did it.  Most
 of us had to wait until someone figured it out and then we had Turbo

Judging from comments by U. Ammann [1], [2], part of the original Pascal
implementation team at ETH Zurich, the first Pascal compiler for the CDC
6x00 family of computers was written in 1970-1971, and a second,
developed from scratch for the same computers but compiling the revised
language, was written in 1972-1974.  I think both of these compilers
were one-pass.

 -- Bill Wood

[1] U. Ammann, The Zurich Implementation, in D.W. Barron (ed.),
_Pascal, The Language and its Implementation_, pp. 63-82, John Wiley
and Sons, 1981.

[2] U. Ammann, Code Generation for a Pascal Compiler, in D.W. Barron
(ed.),  _Pascal, The Language and its Implementation_, pp. 83-124,
John Wiley and Sons, 1981.




Re: [Haskell-cafe] Re: Tutorial uploaded

2005-12-21 Thread Bill Wood
On Wed, 2005-12-21 at 13:10 +0100, Udo Stenzel wrote:
   . . .
 tutorial.  Before that, IO should be restricted to the EVA principle
 (Eingabe, Verarbeitung, Ausgabe == Input, Processing, Output).  It
 was a good principle in the 60s, and it still is.  Unless you want to
 teach people to program as they would do in Basic, that is.

Don't forget that the pioneers of IPO (Input, Process, Output) quickly
went to HIPO (Hierarchical IPO).  My natural design style is top-down,
functional decomposition, separation of concerns, all the good things.
Many problems involve relatively complex mixtures of IO and processing,
which I can capture fairly naturally with programs whose IO is
distributed over various nodes low in a functional decomposition tree.
It often feels like I'm turning a program inside-out to have the IO
monad at the top with pure functional snippets called at various points
in a do sequence.  And if the only way to express my decomposition
tree is with a tree of imperative code inside the IO monad, then I
start to ask, "Why am I here?  I can write this in Scheme or OCaml."

Since I'm a Haskell novice, I'm well aware that I may totally
misunderstand the situation.  However, I hadn't seen anything in
tutorials that would lead me to think so.  That said, one of the great
effects of these recent threads is that they've alerted me to the fact
that apparently the available tutorial information has expanded and been
improved since last I looked.  So I think I shall now go do some deep
browsing; thanks for the links.

 -- Bill Wood

PS:  While looking over my post it occurred to me that the issue is at
least as much methodological as it is linguistic.  So I ask:  Does
Haskell stand far enough apart from other programming languages to
warrant adapting standard methodological principles to it?  Is there an
identifiable Haskell Way?

 -- bw




Re: [Haskell-cafe] Prime numbers

2005-12-20 Thread Bill Wood
On Tue, 2005-12-20 at 16:53 +, Daniel Carrera wrote:
 Henning Thielemann wrote:
 factors :: Integer -> Integer -> Bool
 factors m n | m == n = False
             | m < n  = divides m n || factors (m+1) n
  
  
  Btw. I find the recursion harder to understand than the explicit
  definition:
  
  factors n = any (divides n) [2..(n-1)]
 
 For what it's worth, I also found this function un-intuitive. What I'd 
 expect a function called factors to do is exactly what yours does, and 
 not what the one in the example does.

It helped me to read "factors m n" as "there is an integer between m and
n-1 inclusive that divides n".  Then the recursion made sense.
Thielemann's "factors n" would read "there is an integer between 2 and
n-1 inclusive that divides n".
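
The thread leaves divides undefined; here is one plausible reading (my
guess at the helper, not code from the thread), with the section in the
explicit version arranged to match:

    divides :: Integer -> Integer -> Bool
    divides m n = n `mod` m == 0     -- "m divides n"

    hasFactor :: Integer -> Bool
    hasFactor n = any (`divides` n) [2 .. n - 1]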

 -- Bill Wood




Re: [Haskell-cafe] Re: Tutorial uploaded

2005-12-20 Thread Bill Wood
On Tue, 2005-12-20 at 20:07 +0100, Sebastian Sylvan wrote:
   . . .
 I'm still looking for a good *practical* tutorial that I could
 recommend to newcomers.
 IO, data types and QuickCheck in the very first chapter, I say! Real
 program examples from the get go, and go into the theory on why this
 has been hard in FP before Haskell (or Monadic IO rather) much much
 later, so as to not scare people away.

I second this motion.  I've been interested in Haskell for some time,
experimented with it several years ago, and then moved on to other
things.  That first time around I saw no comprehensible discussions of
monadic IO (for example, I thought that the fact that "when you're in a
monad you can't get out" was characteristic of monads in general, not a
peculiarity of IO).  The state of available knowledge for beginners has
clearly improved since I last looked.  I applaud this discussion for
making Haskell more accessible for the newbie.

(Now, if someone would just explain how to get reliable performance
information while jumping through only a bounded number of hoops ... :-)

 -- Bill Wood




Re: Re[2]: [Haskell-cafe] Monads in Scala, XSLT, Unix shell pipes was Re: Monads in ...

2005-11-27 Thread Bill Wood
(I'm going to do a lazy permute on your stream of consciousness; hope it
terminates :-).

I think the Rubicon here is the step from one to many -- one
function/procedure to many, one thread to many, one processor to
many, and so on.  Our favorite pure functions are like the Hoare triples and
Dijkstra weakest preconditions of the formal methods folks in that the
latter abstract from the body of a procedure to the input-output
relation it computes; both the function and the abstracted procedure are
atomic and outside of time.  After all, aren't referential
transparency and the copy rule all about replacing a function body with
its results?  Well, as soon as there are two or more
functions/procedures in the same environment, the prospect of
interaction and interference arises, and our nice, clean,
*comprehensible* atemporal semantics get replaced by temporal logic,
path expressions, trace specifications, what have you.  Some notion of
process is inevitable, since now each computation must be treated as an
activity over time in order to relate events that occur during the
execution of one computation with the events of another.

Functional programming gives us the possibility of using algebra to
simplify the task of reasoning about single programs.  Of course,
non-functional procedures can also be reasoned about algebraically,
since a procedure P(args) that hammers on state can be adequately
described by a pure function f_P :: Args -> State -> State.  The problem
is, of course, that the state can be large.

But the functional paradigm offers some hope for containing the
complexity in the world of many as it does in the world of one. I think
combining formalisms like Hoare's CSP or Milner's CCS with computations
gives us the possibility of doing algebra on the temporal event
sequences corresponding to their interactions; the hope is that this is
simpler than doing proofs in dynamic or temporal logic.  Using
functional programs simplifies the algebraic task by reducing the size
of the set of events over which the algebra operates -- you consider
only the explicitly shared parameters and results, not the implicitly
shared memory that can couple non-functional procedures.

It is conceivable that you can get your compositionality here as well.
Suppose we package computations with input-output parameter
specifications and CSP-like specifications of the pattern of event
sequences produced when the computation executes.  It may be possible to
reason about the interactions of the event sequences of groups of
packages, determine the event sequences over non-hidden events provided
by the composite system, etc.

As far as Bulat's comment goes, I'm mostly in agreement.  My dataflow
view was really driven by the intuition that a functional program can be
described by a network of subfunctions linking outputs to inputs; cross
your eyes a little and voila!  A dataflow network.  And if we're smart
enough to make a compiler do that, why bother the programmer?  But
you're not talking about analyzing a function into a
parallel/concurrent/distributed implementation; rather, you're
interested in synthesizing a temporal process out of interacting
computations.

The temporal aspect won't go away.  And that's the problem.

 -- Bill Wood




Re: Dataflow and Comonads was Re: [Haskell-cafe] Monads in Scala, ...

2005-11-26 Thread Bill Wood
On Sat, 2005-11-26 at 18:43 +0100, Shae Matijs Erisson wrote:
 Greg Woodhouse [EMAIL PROTECTED] writes:
 
  Maybe this is a different topic, but exploring concurrency in Haskell is
  definitely on my to do list, but this is really a bit of a puzzle.  One
 thing I've been thinking lately is that in functional programming the
 process is really the wrong abstraction (computation is reduction, not a
 sequence of steps performed in temporal order). But what is concurrency
 if there are no processes to run concurrently? I've been thinking about
 action systems and non-determinism, but am unsure how the pieces really
 fit together.
 
 Maybe dataflow languages like Lucid Synchrone, Esterel and Lustre?
 http://en.wikipedia.org/wiki/Dataflow_language
 http://en.wikipedia.org/wiki/Synchronous_programming_language

I also think dataflow is a very nice model for concurrency in pure
functional languages.  You might want to look at some of the work done
by Edward Lee's group in the EE Dept. at Berkeley.  They developed a
generalization of dataflow called token flow which is quite nice.  It
became one of the basic models supported by their heterogeneous
simulation systems, Ptolemy and Ptolemy II.  I seem to recall that one
student at a Texas university did a PhD thesis on dataflow for Haskell
and then went to work on token flow in Lee's group.

The one downside I found to using dataflow was that most software people
seem to be uncomfortable with the lack of identifiable processes doing
significant bits of work.  I guess if they're not floundering
around in mutual exclusion, semaphores, deadlock detection and all the
other manifestations of unmanaged complexity, they don't feel they've
*accomplished* anything (BTW I grew up on Dijkstra, Hoare and Brinch Hansen, so
I can get away with saying this :-).  Interestingly enough, and perhaps
obvious in retrospect, I often found hardware designers to be very
comfortable with dataflow computations.

Does anyone know whether much research has been done on using dataflow
for concurrency in pure functional computation?  How does it mesh with
evaluation strategies?  And, of course, the flip side: has anyone looked
at re-examining real-world problems, conventionally handled with
coordinating/communicating processes, as dataflow computations?
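
Lazy streams already give a tiny taste of dataflow in Haskell -- a toy
sketch of my own, with a node as a stream function and feedback wired
up lazily:

    type Stream a = [a]

    -- a two-input node
    adder :: Stream Int -> Stream Int -> Stream Int
    adder = zipWith (+)

    -- feedback loop: running sums of the input
    sums :: Stream Int -> Stream Int
    sums xs = out
      where out = zipWith (+) xs (0 : out)   -- take 5 (sums [1..]) == [1,3,6,10,15]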

The latter consideration may be the toughest to deal with.  I was
working on a project to handle PHY level communication algorithms with
dataflow, and I had a great deal of difficulty with the comma-detect
function of XAUI.  I made some progress, but I never found a
satisfactory solution.  On the other hand, I did work out a fairly nice
dataflow algorithm for CRC32.

 -- Bill Wood




RE: [Haskell-cafe] Function application like a Unix pipe

2005-11-23 Thread Bill Wood
On Wed, 2005-11-23 at 08:55 -0800, Scherrer, Chad wrote:
   . . .
 I see. I like the argument order also, since it so nicely reflects
 mathematical notation. But I do think there's a place for (flip ($)) and
 (flip (.)). The problem is that the assignment of fixities is much more
 subtle and requires a consideration of what should be considered proper
 style.

Interesting note: in Richard Bird and Oege de Moor, _Algebra of
Programming_, pp. 2-3, the authors write

   As a departure from tradition, we write f : A <- B rather than
   f : B -> A to indicate the source and target types associated
   with a function f. ... The reason for this choice has to do with
   functional composition, whose definition now takes the smooth
   form: if f : A <- B and g : B <- C, then f . g : A <- C is defined
   by (f . g) x = f(g x).

Further along in the same paragraph they write:

   In the alternative, so-called diagrammatic forms, one writes
   "x f" for application and "f ; g" for composition, where
   x (f ; g) = (x f) g.

I know I've read about the latter notation as one used by some
algebraists, but I can't put my hands on a source right now.

I guess it's not even entirely clear what constitutes mathematical
notation. :-)

 -- Bill Wood




RE: [Haskell-cafe] Function application like a Unix pipe

2005-11-23 Thread Bill Wood
On Wed, 2005-11-23 at 17:47 +0100, Henning Thielemann wrote:
   . . .
   Why is there no (<<) and why is (=<<) not the default?  The order of 'do
 {a;b;c}' is compatible with that of (>>).  So we have the fundamental
 conflict that usually function application is from right to left, but
 interpreting imperative statements is from left to right.
   I think that's a conflict similar to that of little endian and big
 endian.

There may be something to your functional/imperative conflict.

I had occasion to develop a computational model of a theoretical
state-transition machine that the inventors wanted to be able to
program.  My model used the standard trick of construing a parameterized
operation as a function f : Arg1 -> ... -> Argn -> State -> State.  By
writing the (ML) code for the instructions in curried form and using
a reversed composition operator, I was able to provide a programmatic
interface which could be laid out one instruction per line with the
composition operators way off to the right in the comment column, just
like assembler code!  The inventors thought this was just wonderful
(there's no accounting for taste, I guess :-).
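
In Haskell the same trick might look like this -- a made-up toy machine,
with Control.Arrow's (>>>) as the reversed composition:

    import Control.Arrow ((>>>))        -- f >>> g = g . f

    type MachState = (Int, [Int])       -- hypothetical: (accumulator, stack)

    push :: Int -> MachState -> MachState
    push n (acc, stk) = (acc, n : stk)

    popAdd :: MachState -> MachState
    popAdd (acc, n : stk) = (acc + n, stk)
    popAdd s              = s

    program :: MachState -> MachState
    program =
          push 2   >>>   -- one instruction per line,
          push 3   >>>   -- operators off to the right,
          popAdd   >>>   -- just like an assembler listing
          popAdd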

 -- Bill Wood




Re: [Haskell-cafe] Function application like a Unix pipe

2005-11-23 Thread Bill Wood
On Wed, 2005-11-23 at 21:21 -0500, Cale Gibbard wrote:
   . . .
 Hmm, which edition? My copy (5th ed.) uses the ordinary notation: f(x).
 
 x f does perhaps make more sense, especially with the current
 categorical view of functions, but there would have to be a really
 hugely good reason to change notation, as almost all current work puts
 things the other way around.

My copy reads
   First published 1997 by
   Prentice Hall Europe
   (Address lines)
   Copyright Prentice Hall Europe 1997
   ...
   ISBN 0-13-507245-X

It's hardcover, white cover with red around the spine (standard for this
series edited by C.A.R. Hoare), black banner with 100th title in red.
The lack of any edition information leads me to surmise it's a first
edition.

Do you (or anyone) know if the diagrammatic notation has any currency
among algebraists?

 -- Bill Wood




Re: [Haskell-cafe] Trapped by the Monads

2005-09-20 Thread Bill Wood
   . . .
 What struck me was this bit of code:
 
 assemblyLine w = (return w) >>= makeChopsticks >>= polishChopsticks >>=
 wrapChopsticks
 
 
 Interestingly, this looks like Forth (!), where you put a value on the 
 stack, and successive operations fiddle with the stack as a series of 
 transformations. Not that I know Forth, you understand. Hmm, so Haskell 
 can be a concatenative language if you want it to be.

Some time ago I had occasion to model a special-purpose machine in SML,
and the potential users wanted a programmatic interface that looked like
an assembly language for the machine.  I modeled the instructions as
curried functions with the machine state as the last parameter and
return value, and defined a reverse compose function -- (f >> g) x ===
g (f x).  This allowed me to write programs with op codes and parameters
running down the page, just like real assembler (I tabbed over to place
the >> so they kinda hung out in the comment area so as not to spoil
the illusion).  It was a quick 'n dirty hack that turned out to be
pretty slick.
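
In today's Haskell the chopsticks pipeline quoted above is Kleisli
composition: assuming each step has type a -> m a, the leading return
drops away (a sketch, with made-up step names):

    import Control.Monad ((>=>))

    assemblyLine' :: Monad m
                  => (a -> m a) -> (a -> m a) -> (a -> m a) -> a -> m a
    assemblyLine' make polish wrap = make >=> polish >=> wrap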

 -- Bill Wood
[EMAIL PROTECTED]






Re: [Haskell-cafe] Trapped by the Monads

2005-09-20 Thread Bill Wood
   . . .
 Another thing I noticed in my nano-experience of Haskell is the Maybe
 monad.  This is interesting because it's a bit like hybrid variables.
 If you look at a book like "Writing Solid Code" (or is it "Code
 Complete", I can't remember now) which examines C style, they basically
 scorn the use of hybrid variables.  However, I read in something like
 "Thinking Forth" (or maybe it was just a comment I saw attributed to
 Charles Moore, the inventor of Forth), who actually advocated hybrid
 variables.

Could you briefly elaborate on what you mean by hybrid variables?

 It would be interesting to see how far the notion of Haskell as Forth 
 can go. Can Haskell make a better Forth than Forth can, or does it miss 
 some things which are quite natural in Forth.

I've always thought that there was a pretty natural correspondence
between Forth and FLs (functional languages); words that pop args from
the stack, compute, and push the results correspond to pure code, and
words that do IO, fetch and assign variables, etc. correspond to the
imperative code.

The facts that 1) the innards of the Forth machine are exposed to the
programmer and 2) everything in Forth is a word make seamless
language extension easy; I don't think the same can be said for FLs.

On the other hand, recursion is difficult in Forth, and
interpretation-as-you-read makes higher-order functions difficult (I
never saw any serious attempts at HOFs).

I suspect it's a levels-of-abstraction thing; after all, the Forth
environment could be viewed as the ideal stack machine upon which to
implement FLs and block-structured languages.

 -- Bill Wood
[EMAIL PROTECTED]




Re: [Haskell-cafe] Trapped by the Monads

2005-09-20 Thread Bill Wood
   . . .
 The typical example in C is:
  mem = malloc(1024)
 Malloc returns 0 to indicate that memory cannot be allocated, or a 
 memory address if it can. The variable mem is a so-called hybrid 
 variable; it crunches together 2 different concepts: a boolean value 
 (could I allocate memory?) and an address value (what is the address 
 where I can find my allocated memory).

An infamous example would be the convention in Common Lisp that nil, the
empty list, is also false for conditionals while anything else is
true for conditionals.  So the member function ('a * ['a] -> ['a]) can
be used either as a predicate or a function returning a useful value.

 It's considered a bad idea because it makes it easy for programmers to 
 use the value inappropriately - witness the number of programmers who 
 pass in 0 as a memory location. The suggested solution is to give each 

And the Scheme community chose #f and #t for boolean values so you
had to be a little more explicit about what you were doing.

I mostly agree with the tightening-up, but there are times when I
really miss the nil hacks :-)

 -- Bill Wood
[EMAIL PROTECTED]




Re: [Haskell] Re: [Haskell-cafe] Haskell versus Lisp

2005-09-20 Thread Bill Wood
   . . .
 I was hoping that the examples I requested would be examples of  
 particular control constructs or extensions to the language's syntax  
 and semantics.  Though I admit that such things are possible in lisp,  
 I suspect that their utility is minimal.

As to utility, quite the contrary, I think.  Offhand I can think of the
screamer package for Common Lisp, which provides non-deterministic
mechanisms for use in backtracking applications.  For a while in the
80's there was practically a cottage industry implementing various
flavors of Prolog and other Logic Programming languages in Lisp; one
notable example was LogLisp.  I think many of the more advanced
constructs in CL were originally macro extensions to the earlier lisps;
e.g. structures, objects and classes, the LOOP macro, streams and
iterators, generalized setters and getters.

Actors, which was one of the ancestors of OOP, first appeared as a Lisp
extension.  In the AI heyday of the mid-80's most of the expert system
shells, providing forward and backward chaining mechanisms, frames and
semantic nets, and object-centered and data-driven programming, were
originally implemented as packages integrated into Lisp.

All of these made non-trivial extensions to Lisp, and all were of
arguably great utility.

 -- Bill Wood
[EMAIL PROTECTED]




Re: [OT] proving programs for novices

2003-03-18 Thread Bill Wood
  . . .
 For imperative programming:
 
 D. Gries, The Science of Programming. Springer Verlag, New York, 1981.
 
 E.W. Dijkstra, A Discipline of Programming. Prentice-Hall, 1975.

These are two excellent sources; I've learned from each and taught from
each.  However, they are both a bit stiff for the student with little
background in logic or mathematics.  Several texts did come out in the
late 80's that taught the same approach from a more elementary starting
point.  Three such are

E.W. Dijkstra and W.H.J. Feijen, A Method of Programming, Addison-Wesley,
1988, ISBN 0-201-17536-3

Geoff Dromey, Program Derivation/The Development of Programs from
Specifications, Addison-Wesley, 1989, ISBN 0-201-41624-7

Edward Cohen, Programming in the 1990s, Springer-Verlag, 1990,
ISBN 0-387-97382-6

 For functional programming:
 
 R. Bird, Introduction to Functional Programming using Haskell,
 2nd edition. Prentice-Hall, 1998.

I'd like to hear about more sources here as well.  I've started in on

Richard Bird and Oege de Moor, Algebra of Programming, Prentice Hall,
1997, ISBN 0-13-507245-X

but it is hardly elementary!

Another very interesting text is

John Cooke, Constructing Correct Software/the basics, Springer-Verlag,
1988, ISBN 3-540-76156-X

which almost combines imperative, functional and logical programming
by presenting a method of transforming (logical) specifications through
functions into imperative programs.

 -- Bill Wood
[EMAIL PROTECTED]


Re: speedup help

2003-03-07 Thread Bill Wood
  . . .
 
  Oleg's blew the heap at 847; mine valiantly struggled on 'til it blew
  the heap at 1910.
 
 Hmm, I managed to compute bernoulli 2000 and even bernoulli 3000. The
 code is included. It took 7 minutes (2GHz Pentium IV, 1GB memory) to
 compute bernoulli 2000 and 33 minutes for bernoulli 3000. I monitored
 the memory usage of the compiled application using top and found that
 the resident set stayed at 30MB, which is a little bit less than the
 resident set for Emacs. My emacs has a dozen open windows, and has
 been running for a month. Just for the record, here's the result of
 bernoulli 3000:
 
 (-2891939 ...6744 other digits... 81) % 12072109463901626300591430

Well, kudos to Oleg!  I just ran his code from the msg this one replies
to and got similar results: on a 650 MHz Athlon with 128MB RAM I
compiled with ghc -O (GHC 5.00.1).  Using the time command, bernoulli
2000 took 490 sec. user time while bernoulli 3000 took 2170 sec. user
time.  Notice there were no heap overflows this time.  I hope someone
can explain the differences in space behavior between the version in
Oleg's mail of March 6 and this version.  I'm surprised we are as close
as we are in time, given that my processor is less than half as fast.  I
would also expect that my having 1/8-th the memory he has would slow me
down more due to page faulting.

The current version also fixes a minor glitch: the earlier version gave
B_0 = 0 instead of 1.

I think I got the right results for B_3000: My print-out had the same
denominator along with a 6762-digit numerator with the same initial
seven and final two digits.  I don't get 6744 digits in the middle,
however.

I'm impressed by the good performance characteristics of high-level
Haskell code.

Nice work Oleg,

 -- Bill Wood
[EMAIL PROTECTED]


Re: speedup help

2003-03-06 Thread Bill Wood
Oleg has a very interesting approach; in particular, he avoids
explicit recursion and (most) computing with rationals, while
also replacing the factorials in binomial coefficients by using
successive rows of Pascal's triangle. He also skips over the
B_{2k+1}, k > 0, which are all 0.

I slogged through the standard expansions, deriving a tail
recursive function that rolls along successive Bernoulli numbers,
generating successive rows of Pascal's triangle along the way,
and returning the list of B_n .. B_0.  You can think of the list
of Bernoulli numbers as memoizing just-in-time.
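
A compact sketch of that idea -- not my original code, which carried the
Pascal's-triangle rows along; this one just recomputes the binomials,
trading speed for brevity.  Expanding (B + 1)^(n+1) = B^(n+1) gives
B_0 = 1 and B_n = -(sum over k < n of C(n+1,k) * B_k) / (n+1), so B_1
comes out as -1/2, per the first of the two formulas discussed below:

    bernoullis :: [Rational]
    bernoullis = map b [0 ..]
      where
        b :: Int -> Rational
        b 0 = 1
        b n = negate (sum [ fromIntegral (choose (n + 1) k) * (bernoullis !! k)
                          | k <- [0 .. n - 1] ])
              / fromIntegral (n + 1)

    choose :: Int -> Int -> Integer
    choose n k = product (map fromIntegral [n - k + 1 .. n])
                 `div` product (map fromIntegral [1 .. k])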

Running it in Hugs on a 650 MHz Athlon with 128MB RAM, bernoulli 82
took ca. 1 sec.  Compiled with ghc -O, bernoulli 1000 took approx.
15 sec. wall time; bernoulli 1 blew the heap.

I couldn't get getCPUTime (from module CPUTime) to work for me; if
anyone can enlighten me on how to get timing of function execution
I'd appreciate it.

BTW profiling didn't work; when I tried to compile with profiling
flags I received error msgs saying that interface files for standard
libraries couldn't be found.  Compiling without the flags seems to
work just fine.

Oleg's program brings up another interesting point for all you
mathematicians out there: I found two basically different expansion
formulas for Bernoulli numbers.  One is based on the formal
expansion of the equation

(B + 1)^n = B^n

which results in binomial-theorem expansions all of whose terms are
positive.  The other is based on formal expansion of the equation

(B - 1)^n = B^n

which results in binomial-theorem expansions whose terms alternate
in sign.  The amazing thing is, the two sets of numbers differ at only
one term: the first formula results in B_1 = -1/2 while the
second results in B_1 = +1/2 !

I found the second formula in Conway & Guy, _The Book of Numbers_,
p. 108.  The same formula, with tiny variations, can be found in Knuth,
_Fundamental Algorithms_, p. 109, Abramowitz & Stegun, _Handbook
of Mathematical Functions_, p. 804, and Song Y. Yan, _Number Theory for
Computing_, p. 75.

This has gotten a little long, sorry.  If you want, I can post my Haskell
code or send it privately.

 -- Bill Wood
[EMAIL PROTECTED]


Re: speedup help

2003-03-06 Thread Bill Wood
  . . .
 This code seems to compute 'bernoulli 82' in less than a second, in
 Hugs (on a 2GHz Pentium IV).

Just a note:  I compiled and ran Oleg's and mine for comparison.  The
two seem to be of the same complexity, with Oleg's a little faster
(modulo my using wall time; see previous msg.)

Oleg's blew the heap at 847; mine valiantly struggled on 'til it blew
the heap at 1910.  I must be doing something right, since I'm carrying
around the list of all numbers from B_0 through B_n, while Oleg cleverly
avoids that.  I was also surprised to see Oleg's blow the heap at an
*odd* value -- I thought he skipped those.

 -- Bill Wood
[EMAIL PROTECTED]