On Sun, Aug 13, 2006 at 04:15:30AM +0100, Russell Wallace wrote:
An unusual claim... do you mean all knowledge can be learned verbally,
or do you think there are some kinds of knowledge that cannot be
demonstrated verbally?
On 8/20/06, Eugen Leitl [EMAIL PROTECTED] wrote:
Language can be used to serialize and transfer state of cloned objects. This doesn't mean human experts know their inner state, or can freeze and serialize it, and that other instances can instantiate such serialized state.
Exactly.
The problem is only half solved by the enormous size of the corpus. A huge corpus will only give you plain predictive ability; reasoning and other abilities are not greatly improved. So a simple interaction:

Person: My name is Bosefus
AI: ok
Person: What is my name?
AI: dunno

Now it is true that
I don't believe "yellow" or "hot" can be learned verbally either.

James

Russell Wallace [EMAIL PROTECTED] wrote:
On 8/13/06, Matt Mahoney [EMAIL PROTECTED] wrote:
There is no knowledge that you can demonstrate verbally that cannot also be learned verbally.
An unusual claim... do you mean all
To: agi@v2.listbox.com
Sent: Wednesday, August 16, 2006 2:42 PM
Subject: **SPAM** Re: [agi] Marcus Hutter's lossless compression of human knowledge prize
Now try that on my daughter or any other 3.5 year old. It doesn't work. :}

No, as another poster stated better, there are many things that can't be
I proposed knowledge-based text compression as a dissertation topic,
back around 1991, but my advisor turned it down. I never got back to
the topic because there wasn't any money in it - text is already so
small, relative to audio and video, that it was clear that the money
was in audio and
From: [EMAIL PROTECTED]
To: agi@v2.listbox.com
Sent: Monday, August 14, 2006 10:27:41 AM
Subject: Re: [agi] Marcus Hutter's lossless compression of human knowledge prize
When the same idea is expressed twice in the same corpus (which happens
frequently in real text), a compressor that recognizes
On 8/14/06, Mark Waser [EMAIL PROTECTED] wrote:
Any large corpus of text is going to contain redundant information. Usually it is not coded explicitly, like "Ben is a person." Rather it is implicit, like "Ben said", from which you can infer that Ben is a person.

OK. But what does that inference buy you
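One concrete answer, as a hedged sketch: if the model can derive "X is a person" from "X said ...", then explicit statements of derivable facts are redundant and cost almost nothing to encode. A toy illustration (the rule and the sample sentence are invented for the example, not from the thread):

```python
import re

# Toy inference rule: anyone who is reported to have spoken is a person.
def inferred_facts(corpus):
    facts = set()
    for speaker in re.findall(r"\b([A-Z][a-z]+) said\b", corpus):
        facts.add(f"{speaker} is a person.")
    return facts

corpus = "Ben said the prize is interesting. Ben is a person."
derivable = inferred_facts(corpus)
# "Ben is a person." is derivable from "Ben said ...", so an
# inference-aware compressor could encode that sentence almost for free.
print(derivable)
```

The point is only that derivable facts carry near-zero information given the rule; how much real text is covered by such rules is exactly what is in dispute.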
One would hope that a good lossy compression would
(a) regularize writing style
(b) correct misspellings
(c) correct contradictions
and possibly other beneficial effects, as well as omitting trivia.
I'll bet that a multilevel HMM could do a fairly decent job of (a) and (b), and
maybe a little bit of
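For (b), even something far simpler than a multilevel HMM gets surprisingly far. A minimal noisy-channel speller, sketched below: pick the in-vocabulary word within one edit that is most frequent in the corpus. (The vocabulary and counts are invented for the example; a real system would estimate them from the corpus and also use context, which is where an HMM would come in.)

```python
from collections import Counter

# Tiny invented vocabulary with corpus frequencies (illustrative only).
VOCAB = Counter({"compression": 50, "knowledge": 40, "misspellings": 5, "lossless": 30})

def edits1(word):
    """All strings one edit (delete/insert/replace/transpose) from `word`."""
    letters = "abcdefghijklmnopqrstuvwxyz"
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    inserts = [a + c + b for a, b in splits for c in letters]
    replaces = [a + c + b[1:] for a, b in splits if b for c in letters]
    transposes = [a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1]
    return set(deletes + inserts + replaces + transposes)

def correct(word):
    """Return the most frequent in-vocabulary word within edit distance 1."""
    if word in VOCAB:
        return word
    candidates = [w for w in edits1(word) if w in VOCAB]
    return max(candidates, key=VOCAB.get) if candidates else word

print(correct("mispellings"))  # -> "misspellings" (one inserted "s")
```

A multilevel HMM would improve on this by scoring candidates against the surrounding words rather than by raw frequency, which is what (a), regularizing style, would also need.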
in long term memory, Cognitive Science (10) pp.
477-493.
-- Matt Mahoney, [EMAIL PROTECTED]
- Original Message
From: Pei Wang [EMAIL PROTECTED]
To: agi@v2.listbox.com
Sent: Saturday, August 12, 2006 4:03:55 PM
Subject: Re: [agi] Marcus Hutter's lossless compression of human knowledge
I will try to answer several posts here.

First, I said that there is no knowledge that you can demonstrate verbally that cannot also be learned verbally. For simple cases, this is easy to show. If you test for knowledge X by asking question Q, expecting answer A, then you can train a machine "the
Matt,
You've stated that any knowledge that can be demonstrated verbally CAN
in principle be taught verbally. I don't agree that this is
necessarily true for ANY learning system, but that's not the point I
want to argue.
My larger point is that this doesn't imply that this is how humans do
to your response . . . .
- Original Message -
From: Eliezer S. Yudkowsky [EMAIL PROTECTED]
To: agi@v2.listbox.com
Sent: Sunday, August 13, 2006 1:18 AM
Subject: **SPAM** Re: [agi] Marcus Hutter's lossless compression of human
knowledge prize
As long as we're talking about fantasy
to
something that *really* measures something like intelligence?
- Original Message -
From: Mark Waser [EMAIL PROTECTED]
To: agi@v2.listbox.com
Sent: Sunday, August 13, 2006 3:11 PM
Subject: Re: [agi] Marcus Hutter's lossless compression of human knowledge
prize
Hi all,
I think that a few important points have been lost or misconstrued in
most of this discussion.
First off, there is a HUGE difference between the compression of
knowledge and the compression of strings. The strings "Ben is human.", "Ben
is a member of the species homo sapiens.", "Ben is
Shane,
I don't think the statement the universe is computable can either be
proved or disproved, because it is not a mathematical statement.
However, there can be evidence for or against it.
My objection is not any one you anticipated, but much simpler. As Hume
argued, I don't think we can
I think the Hutter prize will lead to a better understanding of how we
learn semantics and syntax.

I have to disagree strongly. As long as you are requiring recreation at the
bit level as opposed to the semantic or logical
Ben,

So you think that:
Powerful AGI == good Hutter test result
But you have a problem with the reverse implication:
good Hutter test result =/= Powerful AGI
Is this correct?
Shane
A common objection to compression as a test for AI is that humans can't do
compression, so it has nothing to do with AI. The reason people can't compress
is that compression requires both AI and deterministic computation. The human
brain is not deterministic because it is made of neurons,
Howdy Shane,
I'll try to put my views in your format
I think that
Extremely powerful, vastly superhuman AGI == outstanding Hutter test result
whereas
Human-level AGI =/= Good Hutter test result
just as
Human =/= Good Hutter test result
and for this reason I consider the Hutter test a
That seems clear.

Human-level AGI =/= Good Hutter test result
just as
Human =/= Good Hutter test result

My suggestion then is to very slightly modify the test as follows: instead of just getting the raw characters, what you get is the sequence of characters and the probability distribution over the
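The proposal is truncated here, but scoring a predictor on "the sequence of characters and the probability distribution" is naturally done with average log-loss, which is also the per-character code length an arithmetic coder would achieve. A hedged sketch, with a made-up uniform baseline predictor (the function names are mine, not from the thread):

```python
import math

def log_loss_bits(text, predict):
    """Average bits per character when predict(history) returns a dict
    mapping each possible next character to its probability."""
    total = 0.0
    for i, ch in enumerate(text):
        p = predict(text[:i]).get(ch, 1e-12)  # floor to avoid log(0)
        total += -math.log2(p)
    return total / len(text)

# Baseline: uniform over 26 lowercase letters plus space -> log2(27) bits/char.
uniform = lambda history: {c: 1 / 27 for c in "abcdefghijklmnopqrstuvwxyz "}

print(f"{log_loss_bits('hello world', uniform):.2f} bits/char")  # -> 4.75
```

Any model that beats the baseline's log2(27) ≈ 4.75 bits per character is demonstrating prediction, with no bit-exact reconstruction required.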
Matt,
To summarize and generalize data and to use the summary to predict the
future is no doubt at the core of intelligence. However, I do not call
this process compressing, because the result is not faultless, that
is, there is information loss.
It is not only because the human brains are
Yes, I think a hybridized AGI and compression algorithm could do better than either one on its own. However, this might result in an incredibly slow compression process, depending on how fast the AGI thinks. (It would take ME a long time to carry out this process over the whole Hutter
I don't think it's anywhere near that much. I read at about 2 KB
per minute, and I listen to speech (if written down as plain text)
at a roughly similar speed. If you then work it out, by the time
I was 20 I'd read/heard not more than 2 or 3 GB of raw text.
If you could compress/predict
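The estimate above is easy to sanity-check. Assuming roughly two hours a day of reading or listening at the stated 2 KB/min (the hours-per-day figure is my guess, not from the post):

```python
# Back-of-envelope check of the "2 or 3 GB by age 20" figure.
kb_per_min = 2         # reading rate stated in the post
hours_per_day = 2      # assumed average over childhood; a guess
years = 20

kb_total = kb_per_min * 60 * hours_per_day * 365 * years
gb_total = kb_total / (1024 ** 2)
print(f"{gb_total:.1f} GB")  # roughly 1.7 GB, consistent with "2 or 3 GB"
```

Even doubling the daily exposure only reaches a few gigabytes, which is the point: the textual training data of a 20-year-old is tiny by machine learning standards.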
On 8/12/06, Matt Mahoney [EMAIL PROTECTED] wrote:
In order to compress text well, the compressor must be able to estimate probabilities over text strings, i.e. predict text.
Um no, the compressor doesn't need to predict anything - it has the entire file already at hand.
The _de_compressor would
First, the compression problem is not in NP. The general problem of encoding strings as the smallest programs to output them is undecidable.

Second, given a model, then compression is the same as prediction. A model is a function that maps any string s to an estimated probability p(s). A compressor
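The model-to-compressor direction can be sketched in a few lines: given any model p, an arithmetic coder spends about -log2 p(s) bits on string s, so the model's quality directly determines the compressed size. Below, an adaptive order-0 byte model with add-one smoothing (the model and test string are illustrative, not Matt's actual compressor):

```python
import math
from collections import Counter

def ideal_code_length_bits(text):
    """Bits an arithmetic coder would spend on `text` under an adaptive
    order-0 model: p(c | history) = (count(c) + 1) / (len(history) + 256)."""
    counts, seen, bits = Counter(), 0, 0.0
    for byte in text.encode():
        p = (counts[byte] + 1) / (seen + 256)
        bits += -math.log2(p)  # ideal code length for this symbol
        counts[byte] += 1
        seen += 1
    return bits

text = "the cat sat on the mat, the cat sat on the mat"
print(f"{ideal_code_length_bits(text):.0f} bits vs {8 * len(text)} bits raw")
```

Repeated material gets progressively cheaper as the counts grow, which is the whole argument: better prediction of the next symbol is better compression, and vice versa.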
On 8/12/06, Matt Mahoney [EMAIL PROTECTED] wrote:
First, the compression problem is not in NP. The general problem of encoding strings as the smallest programs to output them is undecidable.

But as I said, it becomes NP when there's an upper limit to decompression time.

Second, given a model,
But Shane, your 19 year old self had a much larger and more diverse volume of data to go on than just the text or speech that you ingested... I would claim that a blind and deaf person at 19 could pass a Turing test if they had been exposed to enough information over the years. Especially if they had
On 8/13/06, Matt Mahoney [EMAIL PROTECTED] wrote:
Whether a compressor implements its model as a predictor or not is
irrelevant. Modeling the entire input at once is mathematically
equivalent to predicting successive symbols. Even if you think
you are not modeling, you are. If you design a
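The equivalence Matt appeals to is just the chain rule of probability: a model of whole strings and a successive-symbol predictor carry the same information,

```latex
p(s_1 s_2 \dots s_n) \;=\; \prod_{i=1}^{n} p(s_i \mid s_1 \dots s_{i-1})
```

so a code assigning $-\log_2 p(s)$ bits to the whole input spends exactly $-\log_2 p(s_i \mid s_1 \dots s_{i-1})$ bits per symbol. "Modeling the entire input at once" and "predicting successive symbols" are two factorizations of the same quantity.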
I think compression is essential to intelligence, but the difference between lossy and lossless may make the algorithms quite different.

But why not let competitors compress lossily? As far as prediction goes, the testing part is still the same!

If you guys have a lossy version of the prize I will
From: [EMAIL PROTECTED]
To: agi@v2.listbox.com
Sent: Saturday, August 12, 2006 8:53:40 PM
Subject: Re: [agi] Marcus Hutter's lossless compression of human knowledge prize
Matt,
So you mean we should leave forgetting out of the picture, just
because we don't know how to objectively measure it.
Though objectiveness is indeed
On 8/13/06, Matt Mahoney [EMAIL PROTECTED] wrote:
There is no knowledge that you can demonstrate verbally that cannot also be learned verbally.
An unusual claim... do you mean all knowledge can be learned verbally,
or do you think there are some kinds of knowledge that cannot be
demonstrated
On Aug 12, 2006, at 6:27 PM, Yan King Yin wrote:
I think compression is essential to intelligence, but the
difference between lossy and lossless may make the algorithms quite
different.
For general algorithms (e.g. ones that do not play to the sensory
biases of humans) there should be
As long as we're talking about fantasy applications that require
superhuman AGI, I'd be impressed by a lossy compression of Wikipedia
that decompressed to a non-identical version carrying the same semantic
information.
--
Eliezer S. Yudkowsky http://singinst.org/