in the cortex.
10^8 seconds is 3 years! I think that number's wrong.
--
Philip Hunt, [EMAIL PROTECTED]
Please avoid sending me Word or PowerPoint attachments.
See http://www.gnu.org/philosophy/no-word-attachments.html
---
agi
Archives: https://www.listbox.com
the Loebner prize is silly.
, is there something to AIXI or is it something I can safely ignore?
/Function_predictor )
I also think it would be useful if there were a regular (maybe annual)
competition in the function-predictor domain (or some similar domain).
It would be a bit like the Loebner Prize, except that it would be more
useful to the advancement of AI, since the Loebner Prize is silly.
theories, they are merely rewordings of the same theory. And choosing
between them is arbitrary; you may prefer one to the other because
human minds can visualise it more easily, or it's easier to calculate,
or you have an aesthetic preference for it.
That was helpful. Thanks.
2008/12/1 Matt Mahoney [EMAIL PROTECTED]:
--- On Sun, 11/30/08, Philip Hunt [EMAIL PROTECTED] wrote:
Can someone explain AIXI to me?
AIXI models an intelligent agent interacting with an environment as a pair of
interacting Turing machines. At each step, the agent
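The usual statement of AIXI's decision rule looks like this (reproduced from memory of Hutter's standard definition, so treat the details with caution and check them against the original):

```latex
a_k := \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m}
       \left( r_k + \cdots + r_m \right)
       \sum_{q \,:\, U(q, a_1 \ldots a_m) = o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)}
```

Here U is a universal Turing machine, q ranges over candidate environment programs, ℓ(q) is the length of q, and m is the horizon: shorter programs consistent with the interaction history so far get exponentially more weight.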
in the
mammalian immune system does change as the immune system evolves to
cope with infectious agents; but these changes aren't passed along to
the next generation.)
* if there are any molecular biologists reading, feel free to correct me.
. IIRC that was
the rough order of magnitude assumed in the proposal I reviewed here
recently.
It might well be. In any case, it's apparent that the brain has
different mechanisms for laying down long-term memories and for
short-term thinking on the order of a few seconds.
that nanotechnology or AI are
specifically prohibited by any of the major religions. And if one
society forgoes science, they'll just get outcompeted by their
neighbours.
to train it in the real world (at least some of the time).
If you don't care whether your AGI can use a screwdriver, why have one
in the virtual world?
--
Philip Hunt, cabala...@googlemail.com
Please avoid sending me Word or PowerPoint attachments.
See http://www.gnu.org/philosophy/no-word-attachments.html
to
be anything like a virtual world, it could for example be a software
modality that can see/understand source code as easily and fluently
as humans interpret visual input.)
AIUI you're mostly thinking in terms of 2 or 3. Fair comment?
, even if they look the same.
An animal's intuitive physics is a complex system. I expect that in
humans a lot of this machinery is re-used to create intelligence. (It
may be true, and IMO probably is true, that it's not necessary to
re-create this machinery to make an AGI.)
that).
help too.
).
On the other hand, making a virtual world such as I envision is more
than a spare-time project, but not more than the project of making a
single high-quality video game.
GTA IV cost $5 million, so we're not talking about peanuts here.
comes in.
Actually, $$ aside, we don't even **know how** to make a decent humanoid
robot.
Or, a decently functional mobile robot **of any kind**
Is that because of hardware or software issues?
it.
Until an AI can do this, there's no point in trying to get it to play
at making cakes, etc.
it interacts
with our internal model of the world, than vision is.
Is the reason just that AI researchers spend all day staring at screens and
ignoring their physical bodies and surroundings?? ;-)
:-)
in this field?
I've never used formal proofs of correctness of software, so I can't
comment. I use software testing (unit tests) on pretty much all
non-trivial software that I write -- I find doing so makes things
much easier.
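For what it's worth, the sort of unit test meant here can be as small as this (a hypothetical function and check of my own, not anything from the thread):

```python
def powers_of_two_next(seq):
    """Hypothetical function under test: next element of a doubling sequence."""
    return seq[-1] * 2

# A unit test is just a small, repeatable check of one behaviour,
# cheap enough to re-run after every change:
def test_powers_of_two_next():
    assert powers_of_two_next([1, 2, 4, 8, 16, 32]) == 64

test_powers_of_two_next()
```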
computer to do tasks better than they can (e.g.
play chess) and I see no reason why it shouldn't be possible for
self-awareness. Indeed it would be rather trivial to give an AGI
access to its source code.
intelligence this way. Care to enlighten me?
be a more useful one.
While you're at it you may want to change the size of the chunks in
each item of prediction, from characters to either strings or
s-expressions. Though doing so doesn't fundamentally alter the
problem.
code.
2008/12/27 Matt Mahoney matmaho...@yahoo.com:
--- On Fri, 12/26/08, Philip Hunt cabala...@googlemail.com wrote:
Humans are very good at predicting sequences of
symbols, e.g. the next word in a text stream.
Why not have that as your problem domain, instead of text
compression?
That's
2008/12/28 Philip Hunt cabala...@googlemail.com:
Now, consider if I build a program that can predict how some sequences
will continue. For example, given
ABACADAEA
it'll predict the next letter is F, or given:
1 2 4 8 16 32
it'll predict the next number is 64. (Whether the program
at prediction. Whereas all programs that're good at
prediction are guaranteed to be good at prediction.
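A toy predictor along the lines of the two examples above might look like this (my own sketch; the function name and the two hand-coded hypotheses are illustrative, not a real induction engine, which would have to search a much larger hypothesis space):

```python
def predict_next(seq):
    """Try two fixed hypotheses; return the continuation of the first that fits."""
    # Hypothesis 1: geometric sequence of numbers (e.g. 1 2 4 8 16 32 -> 64).
    if all(isinstance(x, (int, float)) for x in seq) and len(seq) >= 2 and seq[0] != 0:
        r = seq[1] / seq[0]
        if all(abs(seq[i + 1] - seq[i] * r) < 1e-9 for i in range(len(seq) - 1)):
            return seq[-1] * r
    # Hypothesis 2: alternating "AxAyAz..." where every other symbol is fixed
    # and the rest ascend through the alphabet (e.g. "ABACADAEA" -> "F").
    if isinstance(seq, str) and len(seq) >= 5 and len(seq) % 2 == 1:
        anchors, risers = seq[0::2], seq[1::2]
        if len(set(anchors)) == 1 and all(
            ord(risers[i + 1]) == ord(risers[i]) + 1 for i in range(len(risers) - 1)
        ):
            return chr(ord(risers[-1]) + 1)
    return None  # no hypothesis fits

print(predict_next([1, 2, 4, 8, 16, 32]))  # prints 64.0
print(predict_next("ABACADAEA"))           # prints F
```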
2008/12/29 Philip Hunt cabala...@googlemail.com:
2008/12/29 Matt Mahoney matmaho...@yahoo.com:
Please remember that I am not proposing compression as a solution to the AGI
problem. I am proposing it as a measure of progress in an important
component (prediction).
[...]
Turning
2008/12/29 Matt Mahoney matmaho...@yahoo.com:
--- On Mon, 12/29/08, Philip Hunt cabala...@googlemail.com wrote:
Incidently, reading Matt's posts got me interested in writing a
compression program using Markov-chain prediction. The prediction bit
was a piece of piss to write; the compression
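For reference, the prediction half really can be that short; here is my own minimal order-k Markov sketch (not the program discussed in the thread):

```python
from collections import Counter, defaultdict

def train(text, k=2):
    """Order-k Markov model: counts of each symbol following each k-length context."""
    model = defaultdict(Counter)
    for i in range(len(text) - k):
        model[text[i:i + k]][text[i + k]] += 1
    return model

def predict(model, context):
    """Most likely next symbol after `context`, or None for an unseen context."""
    counts = model.get(context)
    return counts.most_common(1)[0][0] if counts else None

model = train("abracadabra abracadabra", k=2)
print(predict(model, "ab"))  # prints r (every "ab" in the training text is followed by "r")
```

Turning those predictions into actual compression is the harder half: rather than just taking the top prediction, you would normally feed the model's probability estimates into an arithmetic coder, which is presumably where the difficulty came in.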
processing power you need: if processing is very
expensive, it makes less sense to re-run an extensive test suite
whenever you make a change.
to say so and make your assumptions concrete.