The uploaded file

     AI-DecisionTree-0.03.tar.gz

has entered CPAN as

   file: $CPAN/authors/id/K/KW/KWILLIAMS/AI-DecisionTree-0.03.tar.gz
   size: 21535 bytes
    md5: c3b1f1675c1c5ac2e8f41dcfc09d1082


Changes since 0.02:

0.03  Mon Sep  2 11:41:18 AEST 2002

  - Added a 'prune' parameter to new(), which controls whether the tree
    will be pruned after training.  This is usually a good idea, so the
    default is to prune.  Currently we prune using a simple
    minimum-description-length criterion.  (There's a short usage
    sketch after this list.)

  - Training instances are now represented using a C struct rather than
    a Perl hash.  This can dramatically reduce memory usage, though it
    doesn't have much effect on speed.  Note that Inline.pm is now
    required.

  - The list of instances is now deleted after training, since it's no
    longer needed.

  - Small speedup to the train() method, achieved by copying less
    data.

  - If get_result() is called in a list context, it now returns a list
    containing the assigned result, a "confidence" score (tentative,
    subject to change), and the tree depth of the leaf this instance
    ended up at.  (See the second sketch after this list.)

  - Internally, each node in the tree now contains information about
    how many training examples contributed to this node, and what the
    distribution of their classes was.

  - Added an as_graphviz() method, which will help visualize trees.
    They're not terribly pretty graphviz objects yet, but they're
    visual.  (See the last sketch after this list.)
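
As a rough illustration of the new 'prune' parameter, here is a
minimal usage sketch.  The attribute names and values are made up for
the example:

    use AI::DecisionTree;

    # Pruning is on by default; pass prune => 0 to keep the full,
    # unpruned tree.
    my $dtree = AI::DecisionTree->new(prune => 0);

    # A couple of made-up training instances.
    $dtree->add_instance(
        attributes => { outlook => 'sunny',    humidity => 'high'   },
        result     => 'no',
    );
    $dtree->add_instance(
        attributes => { outlook => 'overcast', humidity => 'normal' },
        result     => 'yes',
    );

    $dtree->train;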
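
Continuing from that sketch, the new list-context return from
get_result() looks like this:

    my ($result, $confidence, $depth) = $dtree->get_result(
        attributes => { outlook => 'sunny', humidity => 'normal' },
    );
    print "$result (confidence $confidence, depth $depth)\n";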
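
And a sketch of as_graphviz(), assuming the object it returns supports
the usual GraphViz.pm output methods such as as_png():

    my $graphviz = $dtree->as_graphviz;
    $graphviz->as_png('dtree.png');    # write the tree out as a PNG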


  -Ken
