----- Forwarded message from David 'Pablo' Cohn <[EMAIL PROTECTED]> -----
    Date: 13 Nov 2003 11:17:23 -0800
    From: David 'Pablo' Cohn <[EMAIL PROTECTED]>
Reply-To: David 'Pablo' Cohn <[EMAIL PROTECTED]>

The Journal of Machine Learning Research (www.jmlr.org) is pleased to
announce publication of a new paper:
----------------------------------------------------------------------------

Optimality of Universal Bayesian Sequence Prediction
for General Loss and Alphabet
Marcus Hutter
JMLR 4(Nov):971-1000, 2003

Abstract

Various optimality properties of universal sequence predictors based on
Bayes-mixtures in general, and Solomonoff's prediction scheme in
particular, will be studied. The probability of observing x_t at time t,
given past observations x_1...x_(t-1), can be computed with the chain rule
if the true generating distribution μ of the sequences x_1 x_2 x_3 ... is
known. If μ is unknown, but known to belong to a countable or continuous
class M, one can base one's prediction on the Bayes-mixture ξ defined as a
w_ν-weighted sum or integral of the distributions ν ∈ M. The cumulative
expected loss of the Bayes-optimal universal prediction scheme based on
ξ is shown to be close to the loss of the Bayes-optimal, but infeasible,
prediction scheme based on μ. We show that the bounds are tight and that
no other predictor can lead to significantly smaller bounds.
Furthermore, for various performance measures, we show Pareto-optimality
of ξ and give an Occam's razor argument that the choice w_ν ∝ 2^(-K(ν)) for
the weights is optimal, where K(ν) is the length of the shortest program
describing ν. The results are applied to games of chance, defined as a
sequence of bets, observations, and rewards. The prediction schemes (and
bounds) are compared to the popular predictors based on expert advice.
Extensions to infinite alphabets, partial, delayed and probabilistic
prediction, classification, and more active systems are briefly
discussed.
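To make the construction in the abstract concrete, here is a minimal sketch
(not from the paper) of Bayes-mixture prediction over a small finite class M
of Bernoulli sources. The class, the biases, and the uniform prior weights
below are illustrative assumptions; the sketch only shows the mixture
ξ(x_1..x_t) = Σ_ν w_ν ν(x_1..x_t) and the chain-rule predictive probability
ξ(x_t | x_1..x_(t-1)) = ξ(x_1..x_t) / ξ(x_1..x_(t-1)).

```python
# Hypothetical class M: three Bernoulli sources with the given biases,
# and uniform prior weights w_nu (Hutter's Occam weights would instead
# use w_nu proportional to 2^(-K(nu))).
thetas = [0.1, 0.5, 0.9]
weights = [1 / 3, 1 / 3, 1 / 3]

def nu_prob(theta, xs):
    """Probability a Bernoulli(theta) source assigns to the bit string xs."""
    p = 1.0
    for x in xs:
        p *= theta if x == 1 else 1 - theta
    return p

def xi(xs):
    """Bayes-mixture probability xi(xs) = sum over nu of w_nu * nu(xs)."""
    return sum(w * nu_prob(t, xs) for w, t in zip(weights, thetas))

def predict_next(xs):
    """Chain rule: xi(1 | xs) = xi(xs + [1]) / xi(xs)."""
    return xi(xs + [1]) / xi(xs)

history = [1, 1, 1, 0, 1, 1]
p1 = predict_next(history)  # mixture shifts weight toward the theta=0.9 source
```

After a history dominated by ones, the predictive probability of the next
bit being 1 exceeds 0.5, since the posterior concentrates on the most
biased source; this is the feasible scheme based on ξ that the paper
compares against the infeasible scheme based on the true μ.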

----------------------------------------------------------------------------
This paper, and all previous papers in Volume 4, are available
electronically at http://www.jmlr.org in PostScript and PDF formats. The
papers of Volumes 1, 2 and 3 are also available electronically from the
JMLR website, and in hardcopy from the MIT Press; please see
http://mitpress.mit.edu/JMLR for details.

-David Cohn, <[EMAIL PROTECTED]>