Dear all,

We are delighted to announce our recently released ITE (Information
Theoretical Estimators) toolbox. 

The ITE package could be of interest to many of you: the estimation of
information-theoretic quantities (entropy, mutual information,
divergence) plays a central role in numerous important machine
learning problems.

Unfortunately, available packages focus on (i) discrete variables, or
(ii) quite specialized applications and estimation methods. To fill
this gap, we have recently released ITE, a (i) highly modular, (ii)
free and open-source, (iii) multi-platform toolbox, which

1. can estimate many different variants of entropy, mutual
information and divergence measures.

2. offers a simple and unified framework to (a) easily construct new
estimators from existing ones or from scratch, and (b) transparently
use the obtained estimators in information-theoretic optimization
problems.

3. includes a prototype application to a central problem family of
signal processing: independent subspace analysis and its extensions.
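To give a flavor of the kind of continuous (non-discrete) estimation
involved, here is a minimal Python sketch of the classic
Kozachenko-Leonenko k-nearest-neighbor differential entropy estimator.
This is purely illustrative and is not ITE's own interface; the
function name `knn_entropy` and the parameter `k` are our own choices.

```python
import numpy as np
from scipy.special import digamma, gammaln
from scipy.spatial import cKDTree

def knn_entropy(x, k=3):
    """Kozachenko-Leonenko k-NN differential entropy estimate in nats.

    x : (n, d) array of n samples in d dimensions.
    k : which nearest neighbor to use (a small integer).
    """
    n, d = x.shape
    tree = cKDTree(x)
    # Distance from each point to its k-th nearest neighbor
    # (query k+1 neighbors because the closest one is the point itself).
    eps, _ = tree.query(x, k=k + 1)
    eps = eps[:, -1]
    # Log-volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1).
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))
```

For a large standard normal sample, the estimate should approach the
true entropy 0.5 * log(2 * pi * e) of approximately 1.419 nats.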

The homepage of ITE is https://bitbucket.org/szzoli/ite/. Feel free
to use it.

Best, 

Zoltan (http://nipg.inf.elte.hu/szzoli)
_______________________________________________
uai mailing list
[email protected]
https://secure.engr.oregonstate.edu/mailman/listinfo/uai
