Working memories on the order of 100,000 facts are actually pretty
routine. Any quadratic behavior will kill you, so you have to be
careful to write the rules properly; but a few hundred thousand
facts is not a problem at all.
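To sketch what "writing the rules properly" means here: if two patterns in a rule share no variable, the Rete network has to consider every pairing of the matching facts, which is where the quadratic behavior comes from; constraining the join on a shared variable keeps it roughly linear. The templates and rule names below are hypothetical, purely for illustration.

```lisp
;; Hypothetical templates for illustration
(deftemplate customer (slot id))
(deftemplate order (slot customer-id) (slot total))

;; BAD: no variable is shared between the two patterns, so Jess must
;; build a partial match for every (customer, order) pair -- O(n^2).
(defrule flag-orders-slow
  (customer (id ?c))
  (order (customer-id ?o) (total ?t))
  =>
  (printout t "considered a pair" crlf))

;; GOOD: the join is constrained on the shared id ?c, so each order
;; is matched only against its own customer -- roughly linear.
(defrule flag-orders-fast
  (customer (id ?c))
  (order (customer-id ?c) (total ?t))
  =>
  (printout t "order for customer " ?c crlf))
```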
Jess doesn't spool working memory to disk; it lives entirely in RAM.
There are techniques, however, for telling Jess to load only the
facts it actually needs.
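One common shape for such a technique is goal-driven loading: assert a small "need" fact, and let a rule pull the matching records in from external storage only when something actually asks for them. Everything below is a hypothetical sketch, not part of Jess itself; in particular, `fetch-order-totals` stands in for a user-defined function (a Java `Userfunction`) that would query the database.

```lisp
;; Hypothetical templates for the on-demand loading pattern
(deftemplate order (slot customer-id) (slot total))
(deftemplate need-orders (slot customer-id))
(deftemplate orders-loaded (slot customer-id))

(defrule load-orders-on-demand
  (need-orders (customer-id ?c))
  (not (orders-loaded (customer-id ?c)))
  =>
  ;; fetch-order-totals is assumed to be a user-defined function that
  ;; returns a list of totals for this customer's orders.
  (foreach ?t (fetch-order-totals ?c)
    (assert (order (customer-id ?c) (total ?t))))
  ;; Record that this customer's orders are in working memory so the
  ;; rule doesn't fire again for the same customer.
  (assert (orders-loaded (customer-id ?c))))
```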
On Nov 27, 2007, at 12:01 PM, Joe Wass wrote:
I'm considering Jess for a software project that reasons about
large-ish data sets of small objects. Perhaps people could comment
on whether they think JESS is a suitable engine.
From what I know of CLIPS, a rule-based system would be a very
good way to transform the data sets, given the schema of the
data and the processes I'm undertaking. Until I run the software I
don't know the exact amount of data that will be produced (that's
half the point of the exercise!), but I estimate that it'll be on
the order of a hundred thousand (or more) small facts and a small
number of rules. I'm not carrying out any massively complicated
inference, and the combinatorics of the rules themselves aren't
explosive. But might the size of the data be prohibitive?
So my question is, what's the ball-park figure for the amount of
data that JESS can store? Does it cache to disc? My data-set will
possibly fit in memory all at once, but I wouldn't want to rely on
that.
Thanks!
Joe
---------------------------------------------------------
Ernest Friedman-Hill
Informatics & Decision Sciences Phone: (925) 294-2154
Sandia National Labs FAX: (925) 294-2234
PO Box 969, MS 9012 [EMAIL PROTECTED]
Livermore, CA 94550 http://www.jessrules.com
--------------------------------------------------------------------