At 15:56 17.11.02 -0500, Ben wrote:
>...
>a) learning about language (how to comprehend & produce it)

We are using data from our human2human chat rooms; this data is used to
train a hidden Markov model supertagger.
We train two things: normal utterances and discourse.
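At its core, an HMM tagger of this kind is just transition and emission probabilities plus Viterbi decoding. A minimal sketch (the tags and all probabilities here are invented toy values, not our actual model):

```python
# Minimal HMM tagger sketch: toy transition/emission tables and Viterbi
# decoding. Tags and probabilities are invented for illustration only.
TAGS = ["GREET", "QUESTION", "STATEMENT"]

# P(tag_t | tag_{t-1}); "START" is a dummy initial state.
TRANS = {
    ("START", "GREET"): 0.7, ("START", "QUESTION"): 0.2, ("START", "STATEMENT"): 0.1,
    ("GREET", "GREET"): 0.1, ("GREET", "QUESTION"): 0.5, ("GREET", "STATEMENT"): 0.4,
    ("QUESTION", "GREET"): 0.1, ("QUESTION", "QUESTION"): 0.2, ("QUESTION", "STATEMENT"): 0.7,
    ("STATEMENT", "GREET"): 0.1, ("STATEMENT", "QUESTION"): 0.4, ("STATEMENT", "STATEMENT"): 0.5,
}

# P(word | tag), with a small floor probability for unseen words.
EMIT = {
    ("GREET", "hi"): 0.6, ("GREET", "hello"): 0.4,
    ("QUESTION", "where"): 0.5, ("QUESTION", "what"): 0.5,
    ("STATEMENT", "i"): 0.5, ("STATEMENT", "like"): 0.5,
}
FLOOR = 1e-4

def viterbi(words):
    """Return the most likely tag sequence for a list of words."""
    # trellis[i] maps tag -> (best path probability, backpointer tag)
    trellis = [{}]
    for tag in TAGS:
        p = TRANS[("START", tag)] * EMIT.get((tag, words[0]), FLOOR)
        trellis[0][tag] = (p, None)
    for i, word in enumerate(words[1:], start=1):
        col = {}
        for tag in TAGS:
            best = max(
                (trellis[i - 1][prev][0] * TRANS[(prev, tag)], prev)
                for prev in TAGS
            )
            col[tag] = (best[0] * EMIT.get((tag, word), FLOOR), best[1])
        trellis.append(col)
    # Trace back from the best final tag.
    tag = max(TAGS, key=lambda t: trellis[-1][t][0])
    path = [tag]
    for i in range(len(words) - 1, 0, -1):
        tag = trellis[i][tag][1]
        path.append(tag)
    return list(reversed(path))

print(viterbi(["hi", "where", "i", "like"]))
```

Training from the chat logs then amounts to estimating the two tables from tagged data.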

It learns about the things a user tells it: his job, where he comes from, his
mood, etc. We try to cluster this and find correlations, e.g. young girls like
horses, and a horse has a name. The next time another girl talks about her
horse, the bot asks for the horse's name ...
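The "horses have names" behaviour can be sketched as a per-user fact store plus a cluster-level expectation that triggers a follow-up question (all names and the rule itself are invented here for illustration):

```python
# Sketch: per-user fact store plus a learned correlation that triggers
# follow-up questions. Names and rules are invented for illustration.
user_facts = {}  # user -> dict of attribute -> value

# Hypothetical learned expectation: a mentioned horse should have a name.
EXPECTED_SLOTS = {"horse": ["horse_name"]}

def record(user, attribute, value):
    user_facts.setdefault(user, {})[attribute] = value

def follow_up(user, topic):
    """Return a question for an expected-but-missing slot, or None."""
    known = user_facts.get(user, {})
    for slot in EXPECTED_SLOTS.get(topic, []):
        if slot not in known:
            return f"What is your {slot.replace('_', ' ')}?"
    return None

record("anna", "pet", "horse")
print(follow_up("anna", "horse"))   # asks for the missing horse name
record("anna", "horse_name", "Blitz")
print(follow_up("anna", "horse"))   # nothing left to ask -> None
```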

We try to transfer simple things into a MultiNet database with probability
info (like NARS):
A dolphin is a fish
(dolphin SUB fish) + (* CTXT <folk theory>)
It's easy to translate this MultiNet into other languages:
Der Delphin ist ein Fisch.
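The language-independent layer can be sketched as relation tuples plus per-language lexicons and templates; the tuple mirrors the example above, while the lexicon and templates are invented for illustration:

```python
# Sketch: a MultiNet-style fact as a language-neutral tuple with context,
# rendered into surface languages via templates. Lexicon/templates invented.
fact = {"rel": "SUB", "arg1": "dolphin", "arg2": "fish", "ctxt": "folk theory"}

LEXICON = {
    "en": {"dolphin": "dolphin", "fish": "fish"},
    "de": {"dolphin": "Delphin", "fish": "Fisch"},
}
TEMPLATES = {
    ("SUB", "en"): "A {arg1} is a {arg2}.",
    ("SUB", "de"): "Der {arg1} ist ein {arg2}.",
}

def render(fact, lang):
    """Generate a surface sentence for the fact in the given language."""
    lex = LEXICON[lang]
    tpl = TEMPLATES[(fact["rel"], lang)]
    return tpl.format(arg1=lex[fact["arg1"]], arg2=lex[fact["arg2"]])

print(render(fact, "en"))  # A dolphin is a fish.
print(render(fact, "de"))  # Der Delphin ist ein Fisch.
```

Because only the lexicon and templates are language-specific, adding a language does not touch the stored facts.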

We use the positive/negative feedback of the human chatter to rate the
bot answers.
(dolphin SUB fish) becomes
(dolphin SUB fish) + (* CTXT <folk theory>)

This works for chat small talk.
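The feedback loop can be sketched as NARS-style frequency/confidence bookkeeping per statement. The update rule below is the standard evidence-count form with the usual horizon constant, not necessarily the exact rule used:

```python
# Sketch: rating bot answers by chatter feedback, NARS-style.
# frequency = positive evidence / total evidence; confidence grows with
# total evidence toward 1. K is the usual NARS evidential horizon.
K = 1

ratings = {}  # statement -> [positive count, total count]

def feedback(statement, positive):
    """Record one piece of positive or negative chatter feedback."""
    pos, total = ratings.get(statement, [0, 0])
    ratings[statement] = [pos + (1 if positive else 0), total + 1]

def truth(statement):
    """Return (frequency, confidence) for a statement."""
    pos, total = ratings.get(statement, [0, 0])
    frequency = pos / total if total else 0.5
    confidence = total / (total + K)
    return frequency, confidence

feedback("(dolphin SUB fish)", True)
feedback("(dolphin SUB fish)", True)
feedback("(dolphin SUB fish)", False)
print(truth("(dolphin SUB fish)"))
```

A statement that keeps collecting negative evidence can then be demoted or tagged with a context like <folk theory> instead of being deleted outright.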

We are cheating with parsers and AIML, because a bot learning without
cheating is too boring for chatters. We try to reduce the cheating (aka
narrow-AI or AGI-ish tools).

We can feed easy texts for kids into the system; many sentences are parsed
correctly, and a few are transferred correctly into the MultiNet-DB. We try
to get feedback from chatters; that's supervised learning by chatters.

For a new, similar language the bootstrap process is much easier.

> ...
>idiotic human behavior is another question.  And whether there's a business
>point is yet another question... presumably you've found that there is!]

Earning money with this is, IMHO, better than searching for funding, and
it's more fun.

>A USA Today article is a whole different matter.  ...

ACK

>Could a narrow-AI program produce English translations of foreign news
>articles that were worth reading?  Probably, though no tech is there yet --
>but that's an easier problem.  Structural issues and choices of what to say
>are carried over from language to language, and a good reader can ignore
>occasional mistranslations & continual stylistic infelicities.

A Systran translation from the German Bildzeitung into English is funny but
readable.

Leilas Alltag ist anstrengend. Fast 70 Babys und deren Mütter hat sie
zusammen mit ihren Kollegen betreut, bei mehr als 50 Geburten war sie
selbst dabei. Leila will, dass die Mütter sich ohne Druck darüber klar
werden dürfen, ob sie ihre Babys zur Adoption freigeben oder ob es nicht
doch einen anderen Weg gibt. Mit Erfolg: 60 Prozent der Mütter entscheiden
sich für ihr Kind.

Leilas everyday life is arduous. It cared for almost 70 babies and their
mothers together with its colleagues, participated with more than 50 births
it. Leila wants the fact that the mothers without printing over it to
become clear to be allowed itself whether they release their babies for
adoption or whether it gives another way not nevertheless. With success: 60
per cent of the mothers decide for their child.

This translation has some problems that would not be so difficult for a
narrow AI to solve.

Druck = pressure (not printing)
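A narrow-AI post-editing pass for this kind of word-sense error can be sketched as context-conditioned lexical choice; the cue-word lists below are invented for illustration:

```python
# Sketch: picking "pressure" vs. "printing" for German "Druck" by
# counting overlap with context cue words. Cue lists are invented.
SENSE_RULES = {
    "Druck": [
        ("printing", {"press", "book", "paper", "ink"}),
        ("pressure", {"without", "stress", "feel", "decide"}),
    ],
}

def choose_sense(german_word, context_words):
    """Pick the translation whose cue set overlaps the context most."""
    best, best_overlap = None, -1
    for translation, cues in SENSE_RULES.get(german_word, []):
        overlap = len(cues & set(context_words))
        if overlap > best_overlap:
            best, best_overlap = translation, overlap
    return best

context = "the mothers without stress become clear".split()
print(choose_sense("Druck", context))  # -> pressure
```

Real systems would learn such cues from corpora rather than hand-code them, but the principle is the same.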

Using short sentences gives good results; now it's time to make the
sentences longer. More world knowledge is needed.

cu Alex
