My only experience with JUnit was a bit of a disappointment.  JUnit is
designed for testing procedural code, not declarative statements.  The
JUnit tests that we ended up writing were hand-coded.  

I guess the thing is that JUnit is part of the XP concept.  This means
that after you gather ALL of the requirements, you write ALL of the
tests - before writing the first line of code.  Then, if the code
doesn't pass the tests, it's bad.  The XP guys really like to use the
phrase "refactor mercilessly" as more of a chant than a fact.  

The best method that I've found so far is extremely simple.  I write a
spreadsheet with all of the applicable slots (attributes) as the leading
columns, followed by one column per rule, with the rule name as the
heading.  Filling in the attribute columns is a simple binary exercise,
easily done in about a day or two.  Then, with the help of the BA, the
cell for each rule that should fire is marked TRUE (or YES or whatever).
This is exported to a CSV file and each row is pulled into a set of Java
objects.  The rules are run and the results are checked against whether
each rule should have fired or not.  The result is total verification.
Validation is another matter entirely.  

I know it sounds complicated, but it's actually quite easy.  It also
gives you complete alpha and beta error tests (for those with a
statistical background).  Think of a simple AND situation:

IF A and B then C
Testing for 
A = 0, B = 0, C = blank
A = 0, B = 1, C = blank
A = 1, B = 0, C = blank
A = 1, B = 1, C = TRUE
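
In JUnit terms those four rows translate directly into assertions.  A
minimal sketch, assuming JUnit 4 style annotations and a hypothetical
fires(a, b) helper that wraps the rule:

import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;
import org.junit.Test;

public class AndRuleTest {

    // Stub wrapping the IF A and B then C rule; in practice this would
    // assert the facts, run the engine, and report whether C was concluded.
    boolean fires(int a, int b) {
        return a == 1 && b == 1;
    }

    @Test
    public void coversAllFourRows() {
        assertFalse(fires(0, 0));   // must NOT fire - guards against false positives
        assertFalse(fires(0, 1));
        assertFalse(fires(1, 0));
        assertTrue(fires(1, 1));    // must fire - guards against a false negative
    }
}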

Four rows, total checking for the rule.  Now, when I set this up for the
driver validation for a major insurance company, we had 5,120 checks.
When we did it for a major banking loan application, we had 65,000+
checks.  On the other hand, the tests for the driver validation were
written in about two days.  The banking application took a bit over a
week.  The bottom line is that you CANNOT leave a gap in something like
loan applications or insurance underwriting, or some flake will find it
and exploit it.  100% testing is absolutely necessary.

The conventional wisdom has been that you can't check every possibility.
The truth is that you CAN check every possibility.  With today's
computers having enormous RAM and disk storage, this is not a problem.
The only trick is to compartmentalize the rule sets so that you have
manageable testing runs.  The driver acceptability tests took a few
minutes.  The banking application took about four hours.  Both were
running on a Sun E6000 with 8 processors. 
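
As for checking every possibility: the grid itself can be generated
mechanically rather than typed in by hand.  This little sketch (the
attribute names are placeholders, and it isn't tied to any engine)
prints every binary combination of n attributes, one CSV row per
combination; the rule columns are then filled in with the BA.

public class GridGenerator {
    public static void main(String[] args) {
        String[] attributes = { "A", "B", "C", "D" };   // 2^4 = 16 rows
        int n = attributes.length;

        // Header row: the attribute columns; the rule columns get
        // appended afterwards.
        System.out.println(String.join(",", attributes));

        // Each integer from 0 to 2^n - 1 encodes one row of 0/1 values.
        for (int row = 0; row < (1 << n); row++) {
            StringBuilder sb = new StringBuilder();
            for (int col = 0; col < n; col++) {
                if (col > 0) sb.append(',');
                sb.append((row >> col) & 1);
            }
            System.out.println(sb);
        }
    }
}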

Remember - there is total peace of mind for the business users.  ALSO,
nobody (including the BAs) has to learn another testing tool.  JUnit
was used in both cases to test the Java classes themselves.

Good luck.  The first time you write one of these you'll have to iron
out a few wrinkles.  The second time is a snap.  The third time is,
"So?"  

SDG
jco

James C. Owen
Senior Knowledgebase Consultant
6314 Kelly Circle
Garland, TX   75044
972.530.2895 
214.684.5272 (cell)


-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
On Behalf Of [EMAIL PROTECTED]
Sent: Tuesday, June 03, 2003 6:23 PM
To: [EMAIL PROTECTED]
Subject: Re: JESS: more efficient jess debugging

I think Jack Kerkhof wrote:
> All this discussion of an improved parser, debugging, and IDE are
> whetting my appetite!
> 
> Are there any specifications yet regarding what will be included in a
> 'version 7'? And any estimates of timelines?

Jess has always been very customer driven, so the answer to the first
question depends on what people want to see. I've got lots of
anecdotal information, but not any hard numbers on what people think
would be most important. 

I've been meaning to set up a survey to help set priorities. Anybody
know of some good, no-hassle web-based survey software? The stuff I've
been able to find has been more than worthless.

> 
> If parser work is going to be done, exactly what improvements/features
> are being entertained?

Well, the meat of it would be that the current ad-hoc parser would be
replaced by something using a formal grammar and an intermediate AST
representation, so that introducing entirely new rule languages
becomes much simpler, and there will be a way to add user-specified
extensions. Then there will be an XML front end as an alternative to
the Jess language one.

> 
> Alan has clearly been thinking a lot about debugging. Testing and
> Validation are also big issues for commercial applications. Any thoughts
> on automation in that area?

I've thought some about what a "JessUnit" would look like. I'd love to
start a discussion about that. Anyone with XUnit experience who has
ideas? 

> 
> Jack

