How I learned to stop worrying and love the meta-rule.

Long post warning, in case you're not interested in my possibly rambling
nature. I do have a couple of questions about common patterns in Jess
at the end, which I'd appreciate a little feedback on if you have a
few minutes.

Regarding my recent post about performance: I was able to resolve it
this morning solely by making changes to the rules. How? By making more
rules and having them rely on constants instead of variables where
possible.

Like probably many people, I came to Jess & rules systems after many
years (15 - egad!) of object-oriented programming over relational
databases. That way of thinking for so long can sure program the mind to
work a certain way, and it can take some time to deprogram. Anyway, in
the slow version of my system, I had 3 core rules causing bottlenecks.
They took this general form (a simplification; they also seek min/max
of some values, but it gets the idea across):

(defrule do-it
 (aaa (a ?a) (b ?b) (c ?c) (d ?d) (e ?e) (f ?f))
 (bbb (a ?a) (b ?b) (c ?c) (d ?d) (g ?g) (h ?h) (i ?i))
 (ccc (a ?a) (b ?b) (c ?c) (d ?d) (j ?j))
=>
 ;do something meaningful
)

The thing about this is that the values for ?a ?b ?c ?d are taken from
lookup tables in a database, and vary for each request into the system
depending on user actions in a client application (each service request
asserts a few facts, causing rules to fire which possibly update some
shadow-facts, and then collects the updated state from the underlying
java beans to return to the user). There are only a few possible values
for each of these variables, but combined with the facts that contain
them, I was getting an enormous number of partial matches. And I do mean
a lot. Profiling showed that with only about 5000 facts in the system, a
single service request was causing 10+ million value comparisons to be
made. Ouch.
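
For reference, the templates behind those patterns are plain slot-based
facts. Here's a sketch, with slot names taken from the rule above (in
the real system they are shadow-fact templates backed by java beans, so
treat these as stand-ins):

;; sketch only -- slot names lifted from the do-it rule above
(deftemplate aaa (slot a) (slot b) (slot c) (slot d) (slot e) (slot f))
(deftemplate bbb (slot a) (slot b) (slot c) (slot d) (slot g) (slot h) (slot i))
(deftemplate ccc (slot a) (slot b) (slot c) (slot d) (slot j))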

I've had this problem in the back of my mind for several weeks, and
investigated some other avenues such as the profiling I mentioned,
without really getting anywhere. This morning I finally saw the light:
"static queries on dynamic data" - of course! The lookup values are
really static data, and shouldn't be treated like query parameters. What
I had been doing was using these variables to qualify the LHS much the
same way a SQL query would work. But that goes against "SQL: dynamic
queries on static data; Jess: static queries on dynamic data".

Then I started thinking about turning those variables into constants and
having a separate rule for every permutation, and whether that would cut
out all those partial matches and millions of comparisons. Would it be
hard to write a meta-rule to write these rules? Turns out it was
ridiculously easy.

I already had some backchaining going on that fetched some other db data
the first time a given combination of ?a ?b ?c ?d was encountered (to
assert aaa & bbb facts, among a few other things). I added another
backchained fact to keep track of which permutations had yet to have
their rules written, and ended up with something like this:

(defrule get-do-it-rules
 (need-do-it-rules (id ?) (a ?a) (b ?b) (c ?c) (d ?d))
 =>
 ;; Build the rule source as one long line. Note: a ;-style comment
 ;; can't go inside the built string, because str-cat emits no newlines,
 ;; so the comment would swallow everything after it on the line,
 ;; including the closing paren. Use a placeholder call for the RHS.
 (bind ?cmd (str-cat
  "(defrule do-it-" ?a "-" ?b "-" ?c "-" ?d
  " (aaa (a " ?a ") (b " ?b ") (c " ?c ") (d " ?d ") (e ?e) (f ?f))"
  " (bbb (a " ?a ") (b " ?b ") (c " ?c ") (d " ?d ") (g ?g) (h ?h) (i ?i))"
  " (ccc (a " ?a ") (b " ?b ") (c " ?c ") (d " ?d ") (j ?j))"
  " =>"
  " (do-something-meaningful))")) ; placeholder for the real RHS
 (build ?cmd)
 ;; gensym* is a function -- call it, rather than asserting the bare symbol
 (assert (do-it-rules (id (gensym*)) (a ?a) (b ?b) (c ?c) (d ?d))))
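
For completeness, the backchaining side of this looks roughly like the
following (a sketch with assumed names; do-backward-chaining is what
makes Jess assert the need-do-it-rules goal fact automatically whenever
some rule's LHS wants a do-it-rules fact that doesn't exist yet):

;; declare the template, then mark it backward-chaining reactive
(deftemplate do-it-rules
 (slot id) (slot a) (slot b) (slot c) (slot d))
(do-backward-chaining do-it-rules)

;; any rule matching do-it-rules triggers the need-do-it-rules goal,
;; which fires get-do-it-rules above to generate the permuted rules
(defrule relies-on-generated-rules
 (do-it-rules (a ?a) (b ?b) (c ?c) (d ?d))
 =>
 ;; safe to proceed: the do-it-* rule for this permutation now exists
 )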

resulting in rules that look like:

(defrule do-it-400-600-1-300
 (aaa (a 400) (b 600) (c 1) (d 300) (e ?e) (f ?f))
 (bbb (a 400) (b 600) (c 1) (d 300) (g ?g) (h ?h) (i ?i))
 (ccc (a 400) (b 600) (c 1) (d 300) (j ?j) )
=>
  ;do something meaningful
 )

What happened was a big wow - I went from 3 core rules to over 200, and
a run of the system went from 20+ seconds to < 1 second. This is a huge
win, considering this is a service that gets hit by our client
application while the people that use it are on the phone with
customers.

While I could have done this outside of jess by writing java code to
generate a batch file containing the rule permutations, it was more
natural in this case to just do it within jess. Overall though, is
having many rules like this a common pattern? Is there a better route to
go that would accomplish the same thing without generating hundreds of
rules? Just wondering if I'm on the right track here or way out in left
field.

thanks a bunch,

dave

