RE: JESS: Re: Restricted Language Query/ Natural Language Parsing in Jess
Hi Rich, Sort of. :-D If you look at the article in the link, you'll see how the researchers approached the problem. Basically, I would like to start a Jess application (one that follows the Tax Advisor pattern, but isn't a Tax Advisor!) by allowing the users to enter a free-text problem statement -- like when you tell your doctor where it hurts. The doctor can then begin to make inferences about what type of problem you may have by parsing your input and pattern-matching it to syntactically similar, pre-parsed phrases that share the distilled semantics of the original input (if that makes sense), and then ask more leading questions to heuristically home in on the solution.

As an example, in a typical BNF grammar, I might have the production

problem_statement ::= subject verb end-mark

so that a problem_statement is composed of the non-terminals subject, verb, and end-mark in that order. And I might have a vocabulary like

subject ::= I | You | We
verb ::= ran | jumped | cried
end-mark ::= . | ? | !

For each possible combination of these non-terminals and terminals (each production), I'd have to construct a rule to deal with that production. If I understand the article right, what they did was map the set of all synonyms of each non-terminal to a key, and then compose phrases of these keys to store the generic semantics of the input, thereby collapsing the number of patterns for which they need to store a meaning. I just thought that it was a novel approach compared to parsing the string by brute force and trying to process the results with a gazillion rules. Hope that clarifies a bit.
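[A minimal sketch of the synonym-to-key collapsing described above. This is purely illustrative -- the vocabulary, key names, and functions are invented, not taken from the paper, and a real system would do this inside Jess rather than Python:]

```python
# Sketch of the synonym-to-key idea: many surface phrasings collapse to one
# canonical key phrase, so only one stored pattern is needed per key phrase
# instead of one rule per raw production.
# All names and vocabulary here are illustrative, not from the paper.

SYNONYM_KEYS = {
    # each synonym maps to the key for its semantic class
    "i": "AGENT", "you": "AGENT", "we": "AGENT",
    "ran": "MOVE", "sprinted": "MOVE", "jogged": "MOVE",
    "cried": "EMOTE", "wept": "EMOTE", "sobbed": "EMOTE",
}

# one stored meaning per key phrase, not one per surface string
MEANINGS = {
    ("AGENT", "MOVE"): "locomotion-problem",
    ("AGENT", "EMOTE"): "emotional-problem",
}

def normalize(sentence: str) -> tuple:
    """Strip end-marks and map each token to its semantic key."""
    tokens = sentence.lower().rstrip(".?!").split()
    return tuple(SYNONYM_KEYS.get(t, "UNKNOWN") for t in tokens)

def interpret(sentence: str) -> str:
    """Look up the stored meaning of the normalized key phrase."""
    return MEANINGS.get(normalize(sentence), "no-match")

# Three different surface strings, one stored pattern:
for s in ["I ran.", "We sprinted!", "You jogged?"]:
    print(s, "->", interpret(s))   # each maps to "locomotion-problem"
```

[The point being that the rule base only grows with the number of key phrases, not with the number of productions.]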
Regards,
Jason Morris
---
Morris Technical Solutions
[EMAIL PROTECTED]
www.morristechnicalsolutions.com
fax/phone: 503.692.1088

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Rich Halsey
Sent: Thursday, February 05, 2004 4:06 AM
To: [EMAIL PROTECTED]
Subject: JESS: Re: Restricted Language Query/ Natural Language Parsing in Jess

Hi Jason,

In trying to reduce the description of your problem, I came up with the following: use a natural-language front end for the user to interact with a rule-based Tax Advisor, where the rules derive a solution to a query based on data drawn from free-form input. Does this sound even close to what you want to do?

Rich Halsey

----- Original Message -----
From: Jason Morris [EMAIL PROTECTED]
To: Jess-Users [EMAIL PROTECTED]
Sent: Wednesday, February 04, 2004 7:07 PM
Subject: JESS: Restricted Language Query/ Natural Language Parsing in Jess

Hi All,

Sorry for the long post, but this is an esoteric question... I am interested in adapting the Tax Form Advisor (using it almost like an OO design pattern) by adding a component that can reason about information drawn from natural-language input as well as using restricted answers to hard-coded questions. To make the parsing problem more tractable, I began thinking of different ways that I could derive meaning from various input strings without coding a huge parsing engine from scratch or writing hundreds of extra rules. I read a lot of parsing theory and experimented with various BNF syntaxes, but quickly ran into trouble as the language grew and the rules became more complex.

Since my background is in mechanical engineering, I tried to draw parallels with what I already know. In fluid mechanics, the theory of non-dimensional parameters shows that a complex functional equation in m variables and n dimensions can be reduced to (m - n) dimensionless parameters, which should be theoretically easier to manipulate.
I reasoned: why couldn't I attempt to do the same thing with words? In other words, treat the input string as a function of tokens having a certain dimension (membership in semantic subsets), and then attempt to normalize the string to fit a stored semantic pattern that would have meaning to Jess. Theoretically, this would significantly cut down the number of rules that I'd have to write to handle various inputs, even ambiguous ones, while letting the user type away to describe the initial problem. Alas, it seems that my idea was anticipated (see pg. 2): http://www.amia.org/pubs/symposia/D005310.PDF

However, does anyone have any good suggestions as to how to implement this approach in Jess? Thanks!

Jason Morris
Morris Technical Solutions
[EMAIL PROTECTED]
www.morristechnicalsolutions.com
fax/phone: 503.692.1088
RE: JESS: Re: Restricted Language Query/ Natural Language Parsing in Jess
Jason, Rich and Ernest:

Actually, quite a bit of work has been done in this area. It followed shortly after all of the speech-pattern-recognition work started. A fellow named Sankar K. Pal started a program named MyPal wherein he would be able to retrieve sense from nonsense typed in from the keyboard. He gave a presentation way back in 1989 at UT Dallas at one of the M.I.N.D. conferences co-hosted by UT Arlington. Dr. Daniel S. Levine and Dr. Alice O'Toole from UTA were the moderators. They had top-name people from all over the world at the conference. [Gail Carpenter and Steve Grossberg were the top two names there, but the US Naval Surface Warfare Department was also well represented.] Dr. Levine is now in the Department of Psychology at UTA because that was the only department willing to fund his research.

Anyway, Dr. Pal co-authored a book with Paul P. Wang. The Amazon link is
http://www.amazon.com/exec/obidos/ASIN/0849394678/inktomi-bkasin-20/ref%3Dnosim/102-1084313-6504134

I found another book at (of all places) WalMart.com on pattern-recognition software:
http://www.walmart.com/catalog/product.gsp?product_id=1072257&sourceid=1500040820

Some earlier works by Sankar are available from the Indian Statistical Institute in Calcutta:
http://www.wspc.com/books/compsci/4755.htm
but, for some reason, this one is cheaper. Go figure... I guess that a Microsoft link costs more to put up than a Unix link. :-)
http://www.wspc.com/books/compsci/4755.html

Finally, if you act now, you can get one for only $9.95 (or so) on eBay:
http://half.ebay.com/cat/buy/prod.cgi?cpid=805831&domain_id=1856&ad=53983

Enjoy.

SDG
jco
James C. Owen
Knowledgebased Systems Corporation
Senior Consultant
RE: JESS: Re: Restricted Language Query/ Natural Language Parsing in Jess
James,

Thank you for all the good links! I figured that there was a lot more out there, and I feared that I wasn't making myself clear.

Regards,
Jason Morris
-
Morris Technical Solutions
[EMAIL PROTECTED]
www.morristechnicalsolutions.com
fax/phone: 503.692.1088
Re: JESS: Re: Restricted Language Query/ Natural Language Parsing in Jess
Hey James,

Have you ever seen any market ($) for this kind of work out there? I would think that the rule-based side of it could be very interesting.

Rich Halsey
Re: JESS: Re: Restricted Language Query/ Natural Language Parsing in Jess
Hi Jason,

I finally got access to the PDF file that you listed (I had a problem getting there this morning). From what I see in the paper (from a very quick read), it would appear that your natural-language parser (NLP) would have to determine certain essential pieces of information in order to match that information to Jess rules, and then package it up as parameters for the rules to reason over. This suggests that the problem comes in two pieces: (1) trying to determine (through parsing) what subject area (theme) the user is interested in and what the relevant information is for this area of concern, and (2) building a rule-based system that corresponds to reasoning in this theme.

As I mentioned above, one of the challenges would be to deal with the Presentation Service saying "I have a question" and the Inferencing Service saying "I have an answer; let's see if they match." I come across this frequently with clients wanting to build rule-synthesis (or authoring) systems. My approach is to use parameterized rules in the Inferencing Service and send the parameters as a bundle from the Presentation Service, in such a way that the rules can match on some Parameter Object that fits the proper rule. It is kind of abstract when first seen, but it really isn't that difficult to do.

I think the biggest challenge is in the NLP area, where the machine needs to learn how to interpret the input. I am a YACC/LEX aficionado, and I have even dabbled with object-oriented, rule-based parsing, but I have never had occasion to do NLP work (except for voice-activated systems, which really are grammar-based). If I had to do NLP work, I would probably lean towards YACC/LEX rule-based parsers (which opens the AI-learning door).
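[The parameterized-rule idea above might be sketched roughly as follows. This is a hypothetical illustration in Python rather than Jess; the class, rule names, and tax-themed parameters are all invented. In a real system, each rule would be a Jess defrule matching on a parameter fact in working memory:]

```python
# Rough sketch of "parameterized rules matching on a Parameter Object":
# the Presentation Service bundles whatever it extracted into one object,
# and each rule declares which parameters it needs before it can fire.
# All names here are invented for illustration.

class ParameterBundle:
    """The bundle the Presentation Service sends to the Inferencing Service."""
    def __init__(self, **params):
        self.params = params

# each rule: (name, required parameter keys, action)
RULES = [
    ("ask-filing-status", {"theme"},
     lambda p: f"Theme is {p['theme']}; what is your filing status?"),
    ("compute-advice", {"theme", "filing_status", "income"},
     lambda p: f"Advice for {p['filing_status']} with income {p['income']}"),
]

def fire_matching_rules(bundle: ParameterBundle):
    """Fire every rule whose required parameters are all present in the bundle."""
    results = []
    for name, required, action in RULES:
        if required <= bundle.params.keys():   # set-containment match
            results.append((name, action(bundle.params)))
    return results

# A partial bundle only triggers the rule that asks the next question;
# a fuller bundle also triggers the advice rule.
partial = ParameterBundle(theme="tax-form")
print(fire_matching_rules(partial))
```

[The design point is that the two services never need to agree on anything except the shape of the bundle.]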
Sounds very interesting though - good luck!

Rich Halsey
RE: JESS: Re: Restricted Language Query/ Natural Language Parsing in Jess
Jason:

I think that what you might want to do is link an ANN (artificial neural network) parser, or a GA (genetic algorithm) parser if you like, with the rules, so that whatever was typed would make sense to the rules in the format they were expecting; i.e., the parser would do the listening and the rules would do the thinking. :-)

SDG
jco
James C. Owen
Knowledgebased Systems Corporation
Senior Consultant
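[The "parser listens, rules think" division of labor suggested in this thread might look something like the outline below. Python is used purely for illustration and every name is hypothetical; stage 1 could be the ANN/GA parser and stage 2 would be Jess working memory plus defrules in the architecture being discussed:]

```python
# Sketch of "the parser does the listening, the rules do the thinking":
# stage 1 turns free text into the structured fact the engine expects;
# stage 2 is a stand-in for the rule engine reasoning over that fact.
# Hypothetical names throughout.

from dataclasses import dataclass

@dataclass
class ProblemFact:
    """The fixed fact format the rules expect, whatever the user typed."""
    subject: str
    action: str

def listen(text: str) -> ProblemFact:
    """Toy parser: normalize free text into a ProblemFact."""
    words = text.lower().strip(".?!").split()
    return ProblemFact(subject=words[0], action=words[1])

def think(fact: ProblemFact) -> str:
    """Toy rule engine: reasons only over the structured fact,
    never over the raw input string."""
    if fact.action in {"ran", "jumped"}:
        return "physical activity reported; ask about injuries"
    return "no rule matched; ask a clarifying question"

print(think(listen("I ran!")))
```

[Keeping the fact format fixed means the parser can be swapped out (hand-written, ANN, GA) without touching a single rule.]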