Re: [Haskell-cafe] Is Curry alive?

2010-11-05 Thread wren ng thornton

On 11/4/10 6:32 PM, Gregory Crosswhite wrote:

On 11/04/2010 03:06 PM, Dan Doel wrote:

Implementing type inference can be very easy in a logic language,
because most
of the work in a non-logic language is implementing unification:
[...]


Cool! Thank you very much; that is exactly the kind of thing I was
looking for. :-)


And for constraint languages just think of, er, constraint problems.

That is, I'm running a factory that needs to make a bunch of different 
widgets, but different workers/stations only know how to make certain 
widgets, or certain workers are only available for so many hours a day 
or certain days of the week, and certain widgets are needed in order to 
make other bigger widgets. What is the best scheduling of jobs to 
maximize my productivity?


Or to pick a less practical (but no less actual) example: you're a 
crossword puzzle designer. You have a big dictionary of words and you 
need to come up with a 32*32 board where all the words intersect in the 
right way. Oh, and it should be a pretty symmetric design too.


Some formalisms for parsing (or generating) natural language work by 
declaring a bunch of constraints on how words can interact, rather than 
postulating a grammar of tree structures. In certain ways it's a lot 
nicer, right? You just put down rules like "adjectives precede the nouns 
they modify", and that's that; no need to worry about left-corner 
transforms, or making your grammar deterministic, or unary rule cycles, 
or the other traps awaiting an unwary CFG designer.


Or, I have a system of linear inequalities and I want to find the 
solutions. That is, the interesting ones: the boundary solutions.



In a language like ECLiPSe you just need to state whatever the 
constraints are and then hit the START button. In general you shouldn't 
have to worry about how the runtime searches for answers, though 
worrying about that is how you do optimization in constraint programming.


--
Live well,
~wren


Re: [Haskell-cafe] Is Curry alive?

2010-11-05 Thread Ross Paterson
On Thu, Nov 04, 2010 at 06:06:40PM -0400, Dan Doel wrote:
 Implementing type inference can be very easy in a logic language,
 because most of the work in a non-logic language is implementing
 unification:

Provided the implementation includes the occurs check.
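For anyone who has not written one: below is a minimal sketch of first-order 
unification in Haskell, with a made-up `Term` type, showing where the occurs 
check lives. Dropping the check lets a variable be bound to a term that 
contains it, which in type inference shows up as an infinite type.

  import Control.Monad (foldM)
  import qualified Data.Map as M

  -- A made-up first-order term language.
  data Term = Var String | App String [Term]
    deriving (Eq, Show)

  type Subst = M.Map String Term

  -- Chase variable bindings at the head of a term.
  walk :: Subst -> Term -> Term
  walk s t@(Var v) = maybe t (walk s) (M.lookup v s)
  walk _ t         = t

  -- Does variable v occur in t (under the substitution)?
  occurs :: Subst -> String -> Term -> Bool
  occurs s v t = case walk s t of
    Var w      -> v == w
    App _ args -> any (occurs s v) args

  -- Unify two terms, extending the substitution or failing.
  unify :: Subst -> Term -> Term -> Maybe Subst
  unify s t1 t2 = case (walk s t1, walk s t2) of
    (Var v, Var w) | v == w   -> Just s
    (Var v, t) | occurs s v t -> Nothing    -- the occurs check
               | otherwise    -> Just (M.insert v t s)
    (t, Var v) | occurs s v t -> Nothing
               | otherwise    -> Just (M.insert v t s)
    (App f as, App g bs)
      | f == g && length as == length bs
        -> foldM (\s' (a, b) -> unify s' a b) s (zip as bs)
    _ -> Nothing

Without the two `occurs` guards, unifying `Var "x"` with `App "f" [Var "x"]` 
would produce a cyclic substitution instead of failing.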


Re: [Haskell-cafe] Is Curry alive?

2010-11-04 Thread Sebastian Fischer


On Nov 4, 2010, at 2:07 PM, wren ng thornton wrote:

Besides, it's simple enough to just use the MonadLogic class and  
switch between concrete types, if you need to test performance.


You may even try to use only `MonadPlus` to get more instances. The  
only instances of `MonadLogic` are (basically) `[]` and `LogicT` but  
there are many more `MonadPlus` instances on Hackage. If you don't  
need the flexibility of `LogicT` I recommend using `MonadPlus`.
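To make that concrete, here is a small made-up search written only against 
`MonadPlus`; the same code runs unchanged as a plain list, as `Logic` from the 
logict package, or with any other instance:

  import Control.Monad (MonadPlus, guard, msum)

  -- Choose one element of a list, in any MonadPlus.
  choose :: MonadPlus m => [a] -> m a
  choose = msum . map return

  -- Pythagorean triples with all components up to a bound.
  pyth :: MonadPlus m => Int -> m (Int, Int, Int)
  pyth n = do
    a <- choose [1 .. n]
    b <- choose [a .. n]
    c <- choose [b .. n]
    guard (a * a + b * b == c * c)
    return (a, b, c)

`pyth 20 :: [(Int, Int, Int)]` runs the search in the list monad, while 
`observeAll (pyth 20)` from Control.Monad.Logic runs the very same code in 
`Logic`.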


The fmlist package provides a list datatype that is similar to LogicT 
and is an instance of `MonadPlus`. stream-monad is a surprisingly 
space-efficient way to approach infinite search spaces, but it is also 
worth trying for finite searches. If you want to go parallel, try 
tree-monad in combination with parallel-tree-search. And if stream-monad 
still uses too much memory for an infinite search space, try the 
iterative deepening search from level-monad.


Sebastian



Re: [Haskell-cafe] Is Curry alive?

2010-11-04 Thread Gregory Crosswhite

 On 11/02/2010 08:37 PM, wren ng thornton wrote:
Indeed. If your program requires unification or constraint solving 
then logic programming or constraint programming[1] is the way to go.


Would you be kind enough to give me or point me towards a good example 
of such a case?  I've been trying to understand what the point of logic 
languages is for a while now, but whenever I look at them I have trouble 
seeing what exactly it is they offer that solves some problems 
better/more easily than functional languages.


Thanks!
Greg


Re: [Haskell-cafe] Is Curry alive?

2010-11-04 Thread Dan Doel
On Thursday 04 November 2010 5:13:23 pm Gregory Crosswhite wrote:
   On 11/02/2010 08:37 PM, wren ng thornton wrote:
  Indeed. If your program requires unification or constraint solving
  then logic programming or constraint programming[1] is the way to go.
 
 Would you be kind enough to give me or point me towards a good example
 of such a case?  I've been trying to understand what the point of logic
 languages is for a while now, but whenever I look at them I have trouble
 seeing what exactly it is they offer that solves some problems
 better/more easily than functional languages.

Implementing type inference can be very easy in a logic language, because most 
of the work in a non-logic language is implementing unification:

  http://muaddibspace.blogspot.com/2008/01/type-inference-for-simply-typed-lambda.html

3 lines of Prolog to infer types for the simply typed lambda calculus with 
meta-variables (not counting the operator fixity declarations). I've written 
something similar in Haskell, and it's more like 80 lines of real code, a lot 
of which is implementing things that are simply part of Prolog's computation 
model.
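For comparison, here is a rough Haskell sketch along those lines (my own, not 
the blog post's code; it assumes a minimal named-variable term type, a single 
base type, and `Maybe` for failure). Most of the lines are the substitution 
and unification machinery that Prolog provides natively, while `infer` itself 
is roughly the three clauses:

  import Control.Monad.State
  import qualified Data.Map as M

  -- Simply typed lambda terms, with named variables.
  data Term = V String | Lam String Term | App Term Term

  -- Types: metavariables and arrows over a single base type.
  data Ty = TVar Int | Base | Arr Ty Ty
    deriving (Eq, Show)

  type Subst = M.Map Int Ty
  type Infer a = StateT (Int, Subst) Maybe a

  fresh :: Infer Ty
  fresh = do
    (n, s) <- get
    put (n + 1, s)
    return (TVar n)

  -- Chase metavariable bindings at the head of a type.
  walk :: Subst -> Ty -> Ty
  walk s t@(TVar v) = maybe t (walk s) (M.lookup v s)
  walk _ t          = t

  occurs :: Subst -> Int -> Ty -> Bool
  occurs s v t = case walk s t of
    TVar w  -> v == w
    Arr a b -> occurs s v a || occurs s v b
    Base    -> False

  -- Unification with occurs check: what Prolog gives you for free.
  unify :: Ty -> Ty -> Infer ()
  unify t1 t2 = do
    (n, s) <- get
    case (walk s t1, walk s t2) of
      (TVar v, TVar w) | v == w -> return ()
      (TVar v, t) | occurs s v t -> lift Nothing
                  | otherwise    -> put (n, M.insert v t s)
      (t, TVar v) | occurs s v t -> lift Nothing
                  | otherwise    -> put (n, M.insert v t s)
      (Arr a b, Arr c d)         -> unify a c >> unify b d
      (Base, Base)               -> return ()
      _                          -> lift Nothing

  -- The part that mirrors the three Prolog clauses.
  infer :: M.Map String Ty -> Term -> Infer Ty
  infer env (V x)     = maybe (lift Nothing) return (M.lookup x env)
  infer env (Lam x b) = do
    a <- fresh
    r <- infer (M.insert x a env) b
    return (Arr a r)
  infer env (App f x) = do
    tf <- infer env f
    tx <- infer env x
    r  <- fresh
    unify tf (Arr tx r)
    return r

  -- Fully apply the final substitution to the result type.
  resolve :: Subst -> Ty -> Ty
  resolve s t = case walk s t of
    Arr a b -> Arr (resolve s a) (resolve s b)
    t'      -> t'

  runInfer :: Term -> Maybe Ty
  runInfer t = do
    (ty, (_, s)) <- runStateT (infer M.empty t) (0, M.empty)
    return (resolve s ty)

For example, `runInfer (Lam "x" (V "x"))` gives `Just (Arr (TVar 0) (TVar 0))`, 
i.e. a -> a.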

-- Dan


Re: [Haskell-cafe] Is Curry alive?

2010-11-04 Thread Gregory Crosswhite

 On 11/04/2010 03:06 PM, Dan Doel wrote:

Implementing type inference can be very easy in a logic language, because most
of the work in a non-logic language is implementing unification:

   http://muaddibspace.blogspot.com/2008/01/type-inference-for-simply-typed-lambda.html

3 lines of Prolog to infer types for the simply typed lambda calculus with
meta-variables (not counting the operator fixity declarations). I've written
something similar in Haskell, and it's more like 80 lines of real code, a lot
of which is implementing things that are simply part of Prolog's computation
model.

-- Dan



Cool!  Thank you very much;  that is exactly the kind of thing I was 
looking for.  :-)


Cheers,
Greg


Re: [Haskell-cafe] Is Curry alive?

2010-11-03 Thread wren ng thornton

On 11/3/10 12:34 AM, Gregory Crosswhite wrote:

On 11/2/10 8:37 PM, wren ng thornton wrote:

Though I would suggest you look at the LogicT library instead of using
actual lists... Also, you may be interested in reading the LogicT
paper[2] or this paper[3] about search combinators in Haskell. Both
offer a number of optimizations you should be aware of.


I considered it, but wasn't there discussion recently about how LogicT
is a lot slower than the list monad?


In my experience it's always been faster. Though, of course, it will 
depend on the shape of your programs. If you need things like fair 
interleaving, do note that it's quite hard to get right and LogicT has 
already done the work for you, both for Logic/LogicT and for [], so even 
if you do use lists you may still want to use the logict package.
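To illustrate the fairness point with a small made-up example: for lists, 
`mplus` exhausts its left argument before touching the right one, so combining 
two infinite sources starves the second, whereas `interleave` from logict 
alternates between them.

  import Control.Monad (mplus, msum)
  import Control.Monad.Logic (Logic, interleave, observeMany)

  -- Two infinite streams of choices.
  nats, negs :: Logic Integer
  nats = msum (map return [0 ..])
  negs = msum (map return [-1, -2 ..])

  -- Unfair: everything from nats comes before anything from negs.
  unfair :: Logic Integer
  unfair = nats `mplus` negs

  -- Fair: alternates between the two sources.
  fair :: Logic Integer
  fair = nats `interleave` negs

  main :: IO ()
  main = do
    print (observeMany 6 unfair)  -- [0,1,2,3,4,5]; the negatives never appear
    print (observeMany 6 fair)    -- [0,-1,1,-2,2,-3]

`interleave` is a method of the `MonadLogic` class, so the same code also works 
with the `[]` instance from Control.Monad.Logic.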


Besides, it's simple enough to just use the MonadLogic class and switch 
between concrete types, if you need to test performance.


--
Live well,
~wren


Re: [Haskell-cafe] Is Curry alive?

2010-11-02 Thread Sebastian Fischer

Hi Gregory,

On Nov 2, 2010, at 9:27 AM, Gregory Crosswhite wrote:

I was thinking about using Curry, but it looks to me like the  
language is dead and hasn't seen much activity for a few years.


The community is smaller than the Haskell community but the PAKCS  
system is still actively developed. MCC is a compiler that generates C  
(rather than Prolog) code (which is often more efficient) but does not  
come with the same libraries as PAKCS.


Actually, the more that I think about my problem the more that I'm  
thinking I should just stick with the List monad.


If this is feasible then staying in Haskell might be a good choice.

Which does raise the question: when is it better to use a logic  
programming language instead of the list monad?


It is more cumbersome to model logic variables and unification in a  
pure functional language than having them provided natively. If you  
need unification or constraint solving then a language with built-in  
logic variables has an advantage.


An advantage of combining laziness with non-determinism as in Curry is 
that you can interleave the non-deterministic generation of nested data 
with categorizing it, which is more efficient if the evaluation function 
is lazy. This combination is more cumbersome in a pure functional 
language than in a lazy functional-logic language like Curry (see [1] 
for how to do it). If you always need to generate elements completely 
before you categorize them, you probably do not gain from lazy 
non-determinism.
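A standard illustration is permutation sort; here is the plain generate-and-test 
version in Haskell (a generic example, not taken from [1]). The catch is that 
rejecting one permutation whose first two elements are out of order does not 
prune the other permutations that share the failing prefix:

  -- Non-deterministic insertion and permutation in the list monad.
  insert :: a -> [a] -> [[a]]
  insert x []     = [[x]]
  insert x (y:ys) = (x : y : ys) : map (y :) (insert x ys)

  perm :: [a] -> [[a]]
  perm []     = [[]]
  perm (x:xs) = perm xs >>= insert x

  sorted :: Ord a => [a] -> Bool
  sorted xs = and (zipWith (<=) xs (drop 1 xs))

  -- Generate-and-test "permutation sort": each candidate permutation
  -- is accepted or rejected individually.
  permSort :: Ord a => [a] -> [a]
  permSort = head . filter sorted . perm

With lazy non-determinism (Curry, or the explicit-sharing approach of [1]), the 
test demanding only the first two elements of a partially built permutation is 
enough to discard every completion of a bad prefix at once.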


Cheers,
Sebastian

 [1]: http://www-ps.informatik.uni-kiel.de/~sebf/data/pub/icfp09.pdf
  http://sebfisch.github.com/explicit-sharing/


Re: [Haskell-cafe] Is Curry alive?

2010-11-02 Thread wren ng thornton

On 11/2/10 7:09 AM, Sebastian Fischer wrote:

On Nov 2, 2010, at 9:27 AM, Gregory Crosswhite wrote:

Which does raise the question: when is it better to use a logic
programming language instead of the list monad?


It is more cumbersome to model logic variables and unification in a pure
functional language than having them provided natively. If you need
unification or constraint solving then a language with built-in logic
variables has an advantage.


Indeed. If your program requires unification or constraint solving then 
logic programming or constraint programming[1] is the way to go. 
However, if all you need is backtracking or search, then logic/constraint 
languages may still be helpful because they have optimized search 
combinators built in, but they don't offer anything particularly 
interesting over doing search in functional programming.


Though I would suggest you look at the LogicT library instead of using 
actual lists... Also, you may be interested in reading the LogicT 
paper[2] or this paper[3] about search combinators in Haskell. Both 
offer a number of optimizations you should be aware of.



[1] Or constraint-logic programming a la ECLiPSe:
http://eclipseclp.org/
[2] http://okmij.org/ftp/papers/LogicT.pdf
[3] http://web.cecs.pdx.edu/~apt/jfp01.ps

--
Live well,
~wren


Re: [Haskell-cafe] Is Curry alive?

2010-11-02 Thread Gregory Crosswhite

On 11/2/10 8:37 PM, wren ng thornton wrote:
Though I would suggest you look at the LogicT library instead of using 
actual lists... Also, you may be interested in reading the LogicT 
paper[2] or this paper[3] about search combinators in Haskell. Both 
offer a number of optimizations you should be aware of.


I considered it, but wasn't there discussion recently about how LogicT 
is a lot slower than the list monad?


Cheers,
Greg


[Haskell-cafe] Is Curry alive?

2010-11-01 Thread Gregory Crosswhite

 Hey everyone,

This is a little off-topic, but I just ran into a problem which might 
benefit from being attacked by a logic language, so I've been looking 
for a good one to try out --- and hopefully one that has a very 
efficient implementation since I want to iterate through billions and 
possibly trillions of nondeterministically generated solutions.  I was 
thinking about using Curry, but it looks to me like the language is dead 
and hasn't seen much activity for a few years.  Does anyone know about 
whether there is still much going on over there?  Or, alternatively, do 
you have any suggestions regarding other logic language/implementations 
I should check out?  I've also been looking at Prolog but I am having 
trouble seeing whether I can process N non-deterministic solutions in 
O(1) space (rather than first generating an O(N)-size list), and I 
checked out Mercury but the documentation for it is a bit sparse.


Thanks!
Greg


Re: [Haskell-cafe] Is Curry alive?

2010-11-01 Thread Richard O'Keefe

On 2/11/2010, at 1:27 PM, Gregory Crosswhite wrote:

 Hey everyone,
 
 This is a little off-topic, but I just ran into a problem which might benefit 
 from being attacked by a logic language,

Why not describe the problem?

 so I've been looking for a good one to try out --- and hopefully one that has 
 a very efficient implementation since I want to iterate through billions and 
 possibly trillions of nondeterministically generated solutions.

I think about the practical success of Model Checking, and wonder whether it
might be better NOT to iterate through so many.

 I've also been looking at Prolog but I am having trouble seeing whether I can 
 process N non-deterministic solutions in O(1) space (rather than first 
 generating a O(N) size list),

The point of backtracking search is that you need only space for the current
candidate solution, not for all solutions visited so far.  So much so that the
Iterative Deepening family of search algorithms cheerfully *revisit* graph nodes
in order to save time over all.

 and I checked out Mercury but the documentation for it is a bit sparse.
The Mercury documentation I downloaded in March comes to 773 pages.

faq.pdf                  6 pages
library.pdf            470 pages
reference_manual.pdf   150 pages
transition_guide.pdf    10 pages  (Mercury for Prolog programmers)
user_guide.pdf         137 pages

Packing the tutorial into a single HTML file gives another 19 pages.
Ralph Beckett's tutorial adds another 53 pages of PDF.

So that's 845 pages all up.  sparse?





Re: [Haskell-cafe] Is Curry alive?

2010-11-01 Thread Gregory Crosswhite

 On 11/01/2010 06:19 PM, Richard O'Keefe wrote:

On 2/11/2010, at 1:27 PM, Gregory Crosswhite wrote:


Hey everyone,

This is a little off-topic, but I just ran into a problem which might benefit 
from being attacked by a logic language,

Why not describe the problem?



My goal is to exhaustively search through a space in order to categorize 
all of the elements in that space.  This space is combinatorial, so I am 
looking for a good domain-specific language that lets me break the space 
up into a bunch of generators which can be combined combinatorially to 
generate the list of elements in the space.  What makes the generators 
non-trivial is that I am choosing them carefully in order to eliminate 
many symmetries of the problem that effectively result in 
equivalent/redundant elements of the search space, and a consequence of 
this is that some generators depend on the results of other generators.  
The combinatorics of the problem are such that there are about a billion 
elements in the space for a small version of my problem, and about a 
trillion for a slightly larger version.


(More background:  The elements in this space are choices of quantum 
measurement operators which are used to implement a quantum 
error-correcting code, and I am interested in knowing if the code is 
good or not.  My goal is to systematically search through all codes 
with a small (<= 5-6) number of qubits in order to classify all of the 
possible error-correcting codes one can implement using that many 
qubits.)


It is worth mentioning that the function I am applying to these elements 
to compute the desired properties is very fast.  My benchmarks show that 
it can scan through up to ~500,000 elements a second (it helps that it 
is written in C++ :-) ).


Actually, the more I think about my problem, the more I'm thinking I 
should just stick with the list monad.  It gives me a way to create 
generators that rely on the results of other generators and to combine 
them, taking advantage of Haskell's laziness to iterate in O(1) space.  
Which does raise the question: when is it better to use a logic 
programming language instead of the list monad?
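A rough sketch of that plan, with placeholder generators and a placeholder 
classification function standing in for the real quantum-code search: 
dependent generators compose in the list monad with do-notation, and a strict 
fold over the resulting lazy list keeps only the running tally in memory.

  import Data.List (foldl')
  import qualified Data.Map.Strict as M

  -- Placeholder element type; the real one would encode a choice of
  -- measurement operators with symmetries already broken.
  type Element = (Int, Int)

  searchSpace :: Int -> [Element]
  searchSpace n = do
    a <- [1 .. n]
    b <- [a .. n]       -- the second choice depends on the first
    return (a, b)

  -- Placeholder classification (the real one would call into C++).
  classify :: Element -> String
  classify (a, b) = if (a + b) `mod` 2 == 0 then "even" else "odd"

  -- Strict left fold over the lazily produced space.
  tally :: Int -> M.Map String Int
  tally n = foldl' step M.empty (searchSpace n)
    where step m e = M.insertWith (+) (classify e) 1 m

Because nothing else holds on to the list produced by `searchSpace n`, GHC can 
garbage-collect it as `foldl'` walks it, so the space used is that of the map 
of counts rather than of the billions of elements.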



so I've been looking for a good one to try out --- and hopefully one that has a 
very efficient implementation since I want to iterate through billions and 
possibly trillions of nondeterministically generated solutions.

I think about the practical success of Model Checking, and wonder whether it
might be better NOT to iterate through so many.



What exactly do you mean by Model Checking?

Anyway, there might be more clever ways to eliminate possibilities from 
my space (other than the ones I have already) but the advantage of 
having a computer search all of it is that it can scan through all of 
the millions and even billions of possibilities in a stupid brute-force 
fashion in less time than it takes for me to come up with a clever way 
to analyze the space using theoretical analysis.  :-)



I've also been looking at Prolog but I am having trouble seeing whether I can 
process N non-deterministic solutions in O(1) space (rather than first 
generating a O(N) size list),

The point of backtracking search is that you need only space for the current
candidate solution, not for all solutions visited so far.  So much so that the
Iterative Deepening family of search algorithms cheerfully *revisit* graph nodes
in order to save time over all.



Yes, exactly, but my point is that in Prolog I am having trouble 
figuring out whether there is a way to iterate over all of the solutions 
generated non-deterministically in O(1) space.  All of the library 
functions which run an accumulator over a generated set of results seem 
to operate by first generating the full list and then accumulating over 
it, which takes O(n) space; that is wasteful since, as you point out, it 
should only take O(1) space.  However, it is also possible that I 
misunderstand the semantics of how Prolog works and that things which 
look like O(n) to me are actually O(1) --- similar to laziness in 
Haskell.




and I checked out Mercury but the documentation for it is a bit sparse.

The Mercury documentation I downloaded in March comes to 773 pages.

faq.pdf                  6 pages
library.pdf            470 pages
reference_manual.pdf   150 pages
transition_guide.pdf    10 pages  (Mercury for Prolog programmers)
user_guide.pdf         137 pages

Packing the tutorial into a single HTML file gives another 19 pages.
Ralph Beckett's tutorial adds another 53 pages of PDF.

So that's 845 pages all up.  sparse?



a bit sparse?


Yes, because although there is a tutorial covering the trivial stuff, 
and a reference for people who already know what they are doing, there 
is not much intermediate-level documentation for someone who understands 
the basic ideas behind the language