From: Lukasz Stafiniak [EMAIL PROTECTED]
The programs are generally required to exactly match in AIXI (but not
in AIXItl I think).
I'm pretty sure AIXItl wants an exact match too. There isn't anything
there that lets the theoretical AI guess probability distributions and
then get scored based on
On Nov 9, 2007 5:26 AM, Edward W. Porter [EMAIL PROTECTED] wrote:
ED ## what is the value or advantage of conditional complexities
relative to conditional probabilities?
Kolmogorov complexity is universal. For probabilities, you need to
specify the probability space and the initial distribution.
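Kolmogorov complexity itself is uncomputable, but the flavor of a conditional complexity can be sketched with an ordinary compressor as a crude, computable stand-in (this example and the zlib proxy are an illustration, not part of the original post):

```python
import zlib

def c(s: bytes) -> int:
    # Compressed length as a crude, computable proxy for K(s).
    return len(zlib.compress(s, 9))

y = b"the quick brown fox jumps over the lazy dog " * 20
x = b"the quick brown fox jumps over the lazy dog"

cond = c(y + x) - c(y)  # stand-in for conditional complexity K(x|y): tiny, since x repeats y
alone = c(x)            # x on its own compresses far less well
print(cond, alone)
```

Given y, the extra cost of describing x is near zero; on its own, x costs its full compressed length. No probability space had to be specified anywhere.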
-Original Message-
From: Lukasz Stafiniak [mailto:[EMAIL PROTECTED]
Sent: Friday, November 09, 2007 7:13 AM
To: agi@v2.listbox.com
Subject: Re: [agi] How valuable is Solomonoff Induction for real world
AGI?
On Nov 9, 2007 5:26 AM, Edward W. Porter [EMAIL PROTECTED] wrote:
So are the programs just used for computing Kolmogorov complexity, or are
they also used for generating and matching patterns?
The programs do not compute K complexity; they (their length) _are_ (a
variant of) Kolmogorov complexity.
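To make that point concrete, here is a toy sketch (the candidate programs are invented for illustration): the length of the shortest program that reproduces the data is itself the complexity measure, an upper bound on the data's Kolmogorov complexity.

```python
import io, contextlib

# Candidate generating programs (invented examples): each is a Python
# source string whose printed output we compare against the data.
candidates = [
    "print('ab'*500)",       # short program: the data is highly regular
    "print('ab'+'ab'*499)",  # same output, longer description
]
data = "ab" * 500

def output_of(src: str) -> str:
    # Run the candidate program and capture what it prints.
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(src)
    return buf.getvalue().rstrip("\n")

# The shortest matching program's length *is* the complexity score.
k_upper_bound = min(len(p) for p in candidates if output_of(p) == data)
print(k_upper_bound)
```

No separate "complexity computation" happens; the search over programs and the measurement of complexity are the same act.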
On 11/8/07, Edward W. Porter [EMAIL PROTECTED] wrote:
ED Most importantly you say my alleged confusion between
subjective and objective maps into my difficulty to grasp the
significance of Solomonoff induction. If you could do so
I recently found this paper to contain some thinking worthwhile to the
considerations in this thread.
http://lcsd05.cs.tamu.edu/papers/veldhuizen.pdf
- Jef
-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
Is there any research that can tell us what kind of structures are better
for machine learning? Or perhaps w.r.t. a certain type of data? Are there
learning structures that will somehow learn things faster?
There is plenty of knowledge about which learning algorithms are better for
which
My impression is that most machine learning theories assume a search space
of hypotheses as a given, so it is out of their scope to compare *between*
learning structures (eg, between logic and neural networks).
Algorithmic learning theory - I don't know much about it - may be useful
because it
Thanks for the input.
There's one perplexing theorem, in the paper about the algorithmic
complexity of programming, that the language doesn't matter that much, i.e.,
the algorithmic complexity of a program in different languages differs only
by a constant. I've heard something similar about the
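The theorem in question is usually called the invariance theorem; in its standard form (notation supplied here, not from the original post):

```latex
% Invariance theorem: for any two universal machines U and V there is a
% constant c_{UV}, depending on U and V but not on the string x, with
\forall x:\quad \lvert K_U(x) - K_V(x) \rvert \;\le\; c_{UV}
```

Intuitively, $c_{UV}$ is the length of an interpreter for one language written in the other, so the choice of language shifts every complexity by at most that fixed overhead.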
VLADIMIR NESOV IN HIS 11/07/07 10:54 PM POST SAID
VLADIMIR Hutter shows that prior can be selected rather arbitrarily
without giving up too much
ED Yes. I was wondering why the Solomonoff Induction paper made such
a big stink about picking the prior (and then came up with a choice that
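For reference, the sense in which the prior choice costs only a constant can be sketched as follows (a reconstruction in roughly Hutter's notation, not quoted from the book):

```latex
% Universal mixture over a class \mathcal{M} of semimeasures,
% weighted by a prior w_\nu, e.g. w_\nu = 2^{-K(\nu)}:
\xi(x) \;=\; \sum_{\nu \in \mathcal{M}} w_\nu\, \nu(x)
% Dominance: for the true environment \mu \in \mathcal{M},
\xi(x) \;\ge\; w_\mu\, \mu(x),
% so the cumulative log-loss regret of predicting with \xi instead of \mu
% is bounded by \ln w_\mu^{-1}, a constant independent of the data.
```

Any prior that assigns $\mu$ nonzero weight changes only that additive constant, which is why the prior can be varied "rather arbitrarily" without giving up much.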
From: Jef Allbright [mailto:[EMAIL PROTECTED]
I recently found this paper to contain some thinking worthwhile to the
considerations in this thread.
http://lcsd05.cs.tamu.edu/papers/veldhuizen.pdf
This is an excellent paper not only on the subject of code reuse but also on
the techniques
? Any ideas?
Ed Porter
On 08/11/2007, Jef Allbright [EMAIL PROTECTED] wrote:
I'm sorry I'm not going to be able to provide much illumination for
you at this time. Just the few sentences of yours quoted above, while
of a level of comprehension equal to or better than average on this list,
demonstrate epistemological
more people like you start contributing again.
Ed Porter
-Original Message-
From: Derek Zahn [mailto:[EMAIL PROTECTED]
Sent: Thursday, November 08, 2007 3:05 PM
To: agi@v2.listbox.com
Subject: RE: [agi] How valuable is Solomonoff Induction for real world
AGI?
From: Jef Allbright [mailto:[EMAIL PROTECTED]
Sent: Thursday, November 08, 2007 12:55 PM
To: agi@v2.listbox.com
Subject: Re: [agi] How valuable is Solomonoff Induction for real world
AGI?
On 11/8/07, Edward W. Porter [EMAIL PROTECTED] wrote:
Jef,
The paper cited below is more relevant
Edward,
For some reason, this list has become one of the most hostile and poisonous
discussion forums around. I admire your determined effort to hold substantive
conversations here, and hope you continue. Many of us have simply given up.
Cool!
-Original Message-
From: Benjamin Goertzel [mailto:[EMAIL PROTECTED]
Sent: Thursday, November 08, 2007 12:56 PM
To: agi@v2.listbox.com
Subject: Re: [agi] How valuable is Solomonoff Induction for real world
AGI?
Yeah, we use Occam's razor heuristics in Novamente
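A generic Occam's-razor scoring heuristic can be sketched as follows (a hedged illustration of the general idea, not Novamente's actual code; the hypotheses and data are invented examples): prefer the hypothesis minimizing data misfit plus a description-length penalty.

```python
# Invented observed sequence, indexed from n = 1.
data = [2, 4, 6, 8, 10, 12]

# Hypothetical candidate rules: (description, predictor).
hypotheses = [
    ("2*n", lambda n: 2 * n),
    ("n*n - n + 2", lambda n: n * n - n + 2),                    # fits only early terms
    ("2*n if n < 7 else 99", lambda n: 2 * n if n < 7 else 99),  # overfit patch, same fit
]

PENALTY = 1.0  # weight on description length (the "Occam" term)

def score(desc, fn):
    # Total absolute prediction error plus complexity penalty.
    misfit = sum(abs(fn(n) - v) for n, v in enumerate(data, start=1))
    return misfit + PENALTY * len(desc)

best_desc, _ = min(hypotheses, key=lambda h: score(*h))
print(best_desc)
```

Both the first and third hypotheses fit the data perfectly; the complexity penalty is what breaks the tie in favor of the shorter description.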
On 11/8/07, Edward W. Porter [EMAIL PROTECTED] wrote:
Jeff,
In your below flame you spent much more energy conveying contempt than
knowledge.
I'll readily apologize again for the ineffectiveness of my
presentation, but I meant no contempt.
Since I don't have time to respond to all of your
On 11/8/07, Edward W. Porter [EMAIL PROTECTED] wrote:
In my attempt to respond quickly I did not intend to attack him or
his paper
Edward -
I never thought you were attacking me.
I certainly did attack some of your statements, but I never attacked you.
It's not my paper, just one that I
On 11/8/07, Edward W. Porter [EMAIL PROTECTED] wrote:
HOW VALUABLE IS SOLOMONOFF INDUCTION FOR REAL WORLD AGI?
I will use the opportunity to advertise my equation extraction of
the Marcus Hutter UAI book.
And there is a section at the end about Juergen Schmidhuber's ideas,
from the older