You know, there are controlled natural languages that accomplish the same
thing with first-order logic:
http://en.wikipedia.org/wiki/Attempto_Controlled_English
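The idea behind such languages is that a restricted English sentence maps mechanically onto a first-order formula. A toy illustration of that mapping (hypothetical sentence patterns and output notation — this is not the Attempto parser):

```python
# Toy sketch: mapping two ACE-style sentence patterns onto
# first-order-logic strings. NOT the Attempto parser, just an
# illustration of the sentence-to-FOL idea.
import re

def to_fol(sentence):
    # "Every X is a Y." -> universally quantified implication
    m = re.match(r"Every (\w+) is a (\w+)\.", sentence)
    if m:
        x, y = m.groups()
        return f"forall A: {x}(A) -> {y}(A)"
    # "X has a Y." (X a proper name) -> existential statement
    m = re.match(r"(\w+) has a (\w+)\.", sentence)
    if m:
        x, y = m.groups()
        return f"exists B: {y}(B) and has({x}, B)"
    return None

print(to_fol("Every eagle is a bird."))
# forall A: eagle(A) -> bird(A)
```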


On Mon, Feb 4, 2013 at 9:36 PM, Douglas Solomon <[email protected]> wrote:

> All I can say is:
>
> "Dammit:  I LOVE reading Arthur's posts!" :)
>
> --Doug Solomon
>
> On 12/31/2012 10:16 AM, A. T. Murray wrote:
> > Let's outsource the Singularity to the Krauts,
> > shall we? Get some Fahrvergnügen in there,
> > enjoy some Gemütlichkeit while we code AI.
> >
> > It's not like they're going to take over the world
> > just because we gave them the UberMind Source Code?
> > No, let's release it to all nations simultaneously.
> > If the Germans build it faster, stronger, higher,
> > it's because of the German work ethic, on which
> > the rest of us get to take a free ride, Angela.
> >
> > Ben, the Dr. Merkwürdig-Liebe of AGI, is trying to
> > hand the Singularity over to the slave-labor Chinese,
> > who enrich American investors in Apple while
> > themselves dying of factory dust explosions.
> >
> > Let's just put the Singularity up for grabs
> > by whoever wants the dang thang. Accordingly,
> > we have begun to include "Task" items in AI code:
> > \ Task: Make InFerence work also with pronouns and antecedents;
> > \ Task: Make InFerence work with ideas negated by "NOT".
> > But first for a little background of developments
> > over the past two weeks in open-source AI coding:
> > On Mon, 17 Dec 2012 we suddenly realized how to code
> >
> > http://code.google.com/p/mindforth/wiki/InFerence
> >
> > and by the next morning we had whipped out the first
> > working model of machine reasoning with AI inference.
> > Then a funny thing happened on the way to the Singularity,
> > Zero, and if you get this arcane allusion you are smarter'n
> > most and we salute you for your Watson wit and vast knowledge.
> > Any takers? No? Ah, yez guys are all a bunch of zeroes.
> >
> > The funny thing was that with syllogistic machine reasoning,
> > people who had previously dismissed Mentifex AI suddenly
> > began to sit up and take notice, as if to say, "Why,
> > the gobsmack Fuji actually created something." People
> > don't normally talk that way, but, present company here
> > excluded, people don't normally spend their lives creating
> > the impossible to do the unimaginable.
> >
> > So first we got the MindForth AI to take two facts and
> > infer from them a third fact. "BIRDS HAVE WINGS"; check.
> > "Eagles are birds"; check. "EAGLES HAVE WINGS"; BOING!!!
> >
> > Then we started feeding the silent inferences into the
> > http://code.google.com/p/mindforth/wiki/AskUser module
> > to ask a yes-or-no question in confirmation of the idea.
> > The AI asked the human user, "DO EAGLES HAVE WINGS?"
> >
> > We retroactively adjusted the knowledge base with
> > http://code.google.com/p/mindforth/wiki/KbRetro
> > which changes the associative connections among
> > the concepts of the question being asked, so that
> > a "No" answer results in the negation of the idea.
> > And that's where we stand today at the end of 2012.
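The confirm-then-retract loop described above can be sketched as follows (all function and field names are hypothetical — the real AskUser and KbRetro modules operate on Forth associative memory, not a Python dict):

```python
# Sketch of the AskUser / KbRetro loop: a silent inference is
# posed as a yes-or-no question, and a "No" answer negates the
# stored idea while "Yes" confirms it. Names are hypothetical.
def ask_user(triple):
    """Turn an inferred triple into a yes-or-no question."""
    subj, verb, obj = triple
    return f"DO {subj.upper()} {verb.upper()} {obj.upper()}?"

def kb_retro(kb, triple, answer):
    """Retroactively adjust the knowledge base after the answer."""
    if answer == "no":
        kb[triple] = "negated"     # idea kept, but marked NOT true
    else:
        kb[triple] = "confirmed"
    return kb

kb = {}
inferred = ("eagles", "have", "wings")
question = ask_user(inferred)          # "DO EAGLES HAVE WINGS?"
kb = kb_retro(kb, inferred, "yes")
```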
> >
> > Mentifex (Arthur)
>
>
>
> -------------------------------------------
> AGI
> Archives: https://www.listbox.com/member/archive/303/=now
> RSS Feed: https://www.listbox.com/member/archive/rss/303/5037279-a88c7a6d
> Modify Your Subscription:
> https://www.listbox.com/member/?&;
> Powered by Listbox: http://www.listbox.com
>


