Glen wrote:
> And if you tell it that
> there are only, say, 10 possible answers, it will _merely_ produce one
> of those prescribed 10 possible answers.  
>   
You could say that about an employee, too, but that doesn't give much 
insight into what that person might actually be able to do.
> (I live for the day when I ask
> a computer:  "Is this true or false?"  And it answers:  "Neither, it's
> _blue_!"  ;-)
Computers typically don't do that, except in paraphrasing/concept-extraction 
expert systems (e.g. Cyc), because people don't typically 
want them to.  For example, when this Java program is compiled, it's 
clear the compiler knows what `color' really is: it rejects the 
comparison outright.  

enum Color { Blue, Red }

public class Test {

  public static void main(String[] args) {

    Color color = Color.Blue;

    // Compile-time error: incomparable types Color and boolean --
    // the compiler knows `color' is not a truth value.
    if (color == true) {
      System.out.println("true!");
    }
  }
}

One easy way to let that go is to switch to a dynamically typed 
language, where logical inconsistencies are dealt with on a case-by-case 
basis by the programmer.  (Presumably until the programmer can `see' how 
things should fit together.)
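In a dynamically typed language, the same comparison isn't rejected up 
front; it simply runs and evaluates to a value, and noticing the 
inconsistency is left to the programmer.  A minimal Python sketch of 
that contrast (the enum names mirror the Java example above):

```python
from enum import Enum

class Color(Enum):
    BLUE = 1
    RED = 2

color = Color.BLUE

# No compile-time rejection here: comparing an enum member to a
# boolean is legal and quietly evaluates to False, so spotting the
# logical inconsistency is the programmer's job, case by case.
print(color == True)   # False
```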

As far as detecting (supposedly) ill-posed questions goes, if you are 
willing to put aside the complex matter of natural language processing, 
it seems to me it's a matter of a similarity search against a set of 
propositions, and then engaging in a dialog of generalization and 
precisification with the user to identify an unambiguous and agreeable 
form for the question that has appropriate answers.  
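As a toy sketch of that first similarity-search step (setting aside NLP, 
as above): represent each stored proposition as a bag of words and rank 
candidates by word overlap against the user's question.  The proposition 
set and the Jaccard measure below are illustrative assumptions, not a 
description of any real system:

```python
def jaccard(a: str, b: str) -> float:
    """Word-overlap (Jaccard) similarity between two texts."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

# Hypothetical stored propositions; a real system would hold many more
# and use a richer representation than bags of words.
propositions = [
    "the sky is blue",
    "the statement is true",
    "the statement is false",
]

def nearest(question: str, props: list[str]) -> str:
    # Rank candidates by similarity to the question -- the starting
    # point for the generalization/precisification dialog with the user.
    return max(props, key=lambda p: jaccard(question, p))
```

The top-ranked proposition(s) would then seed the back-and-forth with 
the user until an unambiguous form of the question is agreed on.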

Marcus


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org