On 2/19/08, Ben Goertzel [EMAIL PROTECTED] wrote:
If we need a KB orders of magnitude larger to make that approach work,
doesn't that mean we should use another approach?
But do you agree that a KB orders of magnitude larger is required for all
AGI, regardless of *how* the knowledge is
On 20/02/2008, YKY (Yan King Yin) [EMAIL PROTECTED] wrote:
E is also hard, but you seem to be *unaware* of its difficulty. In fact,
the problem with E is the same as that with AIXI -- the theory is elegant,
but the actual learning would take forever. Can you explain, in broad
terms, how the
C is not very viable as of now. The physics in Second Life is simply not
*rich* enough. SL is mainly a space for humans to socialize, so the physics
will not get much richer in the near future -- is anyone interested in
emulating cigarette smoke in SL?
Second Life will soon be integrating
Water does not always run downhill; sometimes it runs uphill.
But never without a reason.
- Original Message -
From: Ben Goertzel [EMAIL PROTECTED]
To: agi@v2.listbox.com
Sent: Wednesday, February 20, 2008 9:47 AM
Subject: Re: [agi] would anyone want to use a commonsense KB?
C is
--- YKY (Yan King Yin) [EMAIL PROTECTED] wrote:
Let me list all the ways of AGI knowledge acquisition:
A) manual encoding in logical form
B) manual teaching in NL and pictures
C) learning in virtual reality (eg Second Life)
D) embodied learning (eg computer vision)
E) inductive
On Feb 20, 2008 1:34 PM, J Storrs Hall, PhD [EMAIL PROTECTED] wrote:
Looking at the moon won't help --
of course it helps, it tells you that something odd is going on with the expression,
as opposed to say yellow sun ...
it might be the case that it described a
particular appearance that only had a
Looking at the moon won't help -- it might be the case that it described a
particular appearance that only had a slight resemblance to other blue things
(as in red hair), for example. There are some rare conditions (high
stratospheric dust) which can make the moon look actually blue.
In fact
So, looking at the moon, what color would you say it was?
Here's what text mining might give you (Google hits):
blue moon 11,500,000
red moon 1,670,000
silver moon 1,320,000
yellow moon 712,000
white moon 254,000
golden moon 163,000
orange moon 122,000
green moon 105,000
gray moon 9,460
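Hit counts like these can be turned into a crude text-mining heuristic: rank candidate colors by phrase co-occurrence frequency. A minimal sketch, with the counts hard-coded from the list above (a real system would query a search API):

```python
# Toy text-mining heuristic: rank candidate moon colors by web
# co-occurrence counts for the phrase "<color> moon" (figures hard-coded
# from the Google-hit list above).
hits = {
    "blue": 11_500_000, "red": 1_670_000, "silver": 1_320_000,
    "yellow": 712_000, "white": 254_000, "golden": 163_000,
    "orange": 122_000, "green": 105_000, "gray": 9_460,
}

total = sum(hits.values())
# Normalize to a probability-like score per color.
scores = {color: n / total for color, n in hits.items()}

# Naive conclusion: "blue" dominates by a wide margin -- which is exactly
# the failure mode under discussion, since "blue moon" is almost always
# idiomatic rather than a literal color report.
best = max(scores, key=scores.get)
```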
To me,
There seems to be an assumption in this thread that NLP analysis
of text is restricted to simple statistical extraction of word-sequences...
This is not the case...
If there were to be a hope for AGI based on text analysis, it would have
to be based on systems that parse linguistic expressions
On Wednesday 20 February 2008 02:58:54 pm, Ben Goertzel wrote:
I note also that a web-surfing AGI could resolve the color of the moon
quite easily by analyzing online pictures -- though this isn't pure
text mining, it's in the same spirit...
U -- I just typed moon into Google and at the
As I am sure you are fully aware, you can't parse English without a knowledge
of the meanings involved. (The council opposed the demonstrators because
they (feared/advocated) violence.) So how are you going to learn meanings
before you can parse, or how are you going to parse before you
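The council/demonstrators sentence is a Winograd-style ambiguity: syntax alone cannot resolve "they". A toy sketch of how world knowledge could break the tie -- the plausibility scores and the `resolve_they` helper are invented purely for illustration, not anyone's actual system:

```python
# Toy Winograd-style pronoun resolution: in "The council opposed the
# demonstrators because they VERB violence", the referent of "they"
# depends on world knowledge, not syntax. The knowledge base below is
# hand-coded purely for illustration.
plausibility = {
    ("council", "feared"): 0.9,
    ("council", "advocated"): 0.1,
    ("demonstrators", "feared"): 0.3,
    ("demonstrators", "advocated"): 0.8,
}

def resolve_they(verb):
    """Pick the candidate referent for which '<referent> <verb> violence'
    is most plausible according to the knowledge base."""
    candidates = ["council", "demonstrators"]
    return max(candidates, key=lambda c: plausibility[(c, verb)])
```

With "feared" it picks the council; with "advocated", the demonstrators -- which is exactly the meaning-before-parsing dependency the example is pointing at.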
--- Ben Goertzel [EMAIL PROTECTED] wrote:
As I am sure you are fully aware, you can't parse English without a
knowledge
of the meanings involved. (The council opposed the demonstrators because
they (feared/advocated) violence.) So how are you going to learn
meanings
before you can
Yes, of course, but no human except an expert in lunar astronomy would have
a definitive answer to the question either
The issue at hand is really how a text-analysis based AGI would distinguish
literal from metaphoric text, and how it would understand the context in which
a statement is
--- Ben Goertzel [EMAIL PROTECTED] wrote:
I note also that a web-surfing AGI could resolve the color of the moon
quite easily by analyzing online pictures -- though this isn't pure
text mining, it's in the same spirit...
Not really. You can get a better answer to what color is the moon? if
OK, imagine a lifetime's experience is a billion symbol-occurrences. Imagine
you have a heuristic that takes the problem down from NP-complete (which it
almost certainly is) to a linear system, so there is an N^3 algorithm for
solving it. We're talking order 1e27 ops.
Now using HEPP = 1e16 x 30
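The arithmetic behind that estimate, spelled out. Reading "HEPP = 1e16 x 30" as roughly 1e16 human-equivalent ops per second over a 30-year lifetime is an assumption on my part:

```python
# Back-of-envelope check of the complexity estimate above.
N = 1e9                              # symbol-occurrences in a lifetime
ops_needed = N ** 3                  # an O(N^3) linear-system solve: 1e27 ops

HEPP = 1e16                          # assumed human-equivalent ops per second
seconds_per_year = 3600 * 24 * 365
lifetime_ops = HEPP * 30 * seconds_per_year   # ~1e25 ops over 30 years

# The N^3 solve exceeds a 30-year human-equivalent compute budget by roughly
# two orders of magnitude -- on the order of 3,000 years at HEPP speed.
years_at_HEPP = ops_needed / (HEPP * seconds_per_year)
```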
On Wed, Feb 20, 2008 at 4:27 PM, J Storrs Hall, PhD [EMAIL PROTECTED] wrote:
OK, imagine a lifetime's experience is a billion symbol-occurrences. Imagine
you have a heuristic that takes the problem down from NP-complete (which it
almost certainly is) to a linear system, so there is an N^3
On 2/21/08, Ben Goertzel [EMAIL PROTECTED] wrote:
Feeding all the ambiguous interpretations of a load of sentences into
a probabilistic
logic network, and letting them get resolved by reference to each
other, is a sort of
search for the most likely solution of a huge system of simultaneous
A PROBABILISTIC logic network is a lot more like a numerical problem than a
SAT problem.
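One way to see the "numerical rather than SAT" point: mutually-dependent interpretation probabilities can be resolved by fixed-point relaxation instead of discrete search. A toy sketch with two ambiguous sentences, each with two readings -- the compatibility weights are invented purely for illustration:

```python
# Toy fixed-point resolution of two mutually-constraining ambiguous
# sentences. compat[i][j] says how compatible reading i of sentence A
# is with reading j of sentence B (weights invented for illustration).
compat = [[0.9, 0.1],
          [0.2, 0.8]]

pA = [0.5, 0.5]  # prior over readings of sentence A
pB = [0.5, 0.5]  # prior over readings of sentence B

def normalize(v):
    s = sum(v)
    return [x / s for x in v]

# Iterate: each sentence's reading distribution is re-estimated from the
# other's, weighted by compatibility -- a numerical relaxation that
# converges to a mutually consistent interpretation, with no SAT-style
# branching anywhere.
for _ in range(50):
    pA = normalize([sum(compat[i][j] * pB[j] for j in range(2))
                    for i in range(2)])
    pB = normalize([sum(compat[i][j] * pA[i] for i in range(2))
                    for j in range(2)])
```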
On Wednesday 20 February 2008 04:41:51 pm, Ben Goertzel wrote:
On Wed, Feb 20, 2008 at 4:27 PM, J Storrs Hall, PhD [EMAIL PROTECTED]
wrote:
OK, imagine a lifetime's experience is a billion
Not necessarily, because
-- one can encode a subset of the rules of probability as a theory in
SMT, and use an SMT solver
-- one can use probabilities to guide the search within an SAT or SMT solver...
ben
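A minimal sketch of that second suggestion -- probabilities guiding the search inside a SAT solver: a toy DPLL that branches on the variable with the most extreme prior and tries its more likely value first. The `dpll` function, the signed-integer clause encoding, and the priors are all illustrative assumptions, not any particular solver's API:

```python
# Toy DPLL SAT search with probability-guided branching. Clauses are
# lists of signed ints (positive literal = variable true, negative =
# false); priors[v] gives an assumed P(v = True).
def dpll(clauses, priors, assignment=None):
    if assignment is None:
        assignment = {}
    # Simplify: drop satisfied clauses, prune falsified literals.
    simplified = []
    for clause in clauses:
        kept = []
        satisfied = False
        for lit in clause:
            var, want = abs(lit), lit > 0
            if var in assignment:
                if assignment[var] == want:
                    satisfied = True
                    break
            else:
                kept.append(lit)
        if satisfied:
            continue
        if not kept:
            return None              # empty clause: conflict
        simplified.append(kept)
    if not simplified:
        return assignment            # all clauses satisfied
    # Probability-guided branching: pick the unassigned variable whose
    # prior is most extreme, and try its more likely value first.
    unassigned = {abs(l) for c in simplified for l in c}
    var = max(unassigned, key=lambda v: abs(priors[v] - 0.5))
    first = priors[var] >= 0.5
    for value in (first, not first):
        result = dpll(simplified, priors, {**assignment, var: value})
        if result is not None:
            return result
    return None

# Example: (x1 or x2) and (not x1 or x3) and (not x2 or not x3),
# with priors suggesting x1 and x3 true, x2 false.
model = dpll([[1, 2], [-1, 3], [-2, -3]], {1: 0.9, 2: 0.2, 3: 0.7})
```

Because the priors point the search straight at a satisfying branch, no backtracking occurs here; with adversarial priors the same code still terminates, just after more branching.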
On Wed, Feb 20, 2008 at 5:00 PM, J Storrs Hall, PhD [EMAIL PROTECTED] wrote:
A
Ben Goertzel [EMAIL PROTECTED] wrote: On Wed, Feb 20, 2008 at 4:27 PM, J
Storrs Hall, PhD wrote:
OK, imagine a lifetime's experience is a billion symbol-occurrences. Imagine
you have a heuristic that takes the problem down from NP-complete (which it
almost certainly is) to a linear system,
To get back to Ben's statement: Is the computer chip industry happy with
contemporary SAT solvers
Well, they are using them, but of course there is loads of room for improvement!!
or would a general solver that is capable of
beating n^4 time be of some use to them? If it would be useful, then
And I seriously doubt that a general SMT solver +
prob. theory is going to beat a custom probabilistic logic solver.
My feeling is that an SMT solver plus appropriate subsets of prob theory
can be a very powerful component of a general probabilistic inference
framework...
I can back this up
It's probably not worth taking this much further, since we're
talking in analogies and metaphors. However, it's my intuition that the
connectivity in a probabilistic formulation is going to produce a much denser
graph (a less sparse matrix) than what you find in the SAT problems that
Adaptive A.I. Inc is looking for Entry-Level AI Psychologist
As all of our current staff members (now up to 17) are quite experienced
and highly productive, we are again looking to fill a (Los Angeles based)
full-time, entry-level position.
We want someone smart, and highly motivated