(see: irc.racrew.us)

On 9/7/08, Eric Burton <[EMAIL PROTECTED]> wrote:
> Oh, thanks for helping me get this off my chest, everyone. If I ever
> finish the thing I'm definitely going to freshmeat it. I think this
> kind of bot, which is really quite trainable, and creative to boot --
> it falls back to a markov chainer -- could be a shoo-in for
> naturalistic NPC dialogue in games. Just disable learning new phrases
> but keep some level of mood assessment and phrase mutation and it
> should functionally never become annoying.
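The markov-chainer fallback mentioned above could look roughly like this minimal bigram sketch (illustrative names only; this is not the bot's actual code):

```python
import random
from collections import defaultdict

class MarkovChainer:
    """Minimal bigram Markov text generator, the kind of fallback
    described above. Class and method names are illustrative."""

    def __init__(self):
        # word -> list of words observed to follow it (duplicates
        # preserved so frequent successors are picked more often)
        self.transitions = defaultdict(list)

    def learn(self, sentence):
        words = sentence.split()
        for a, b in zip(words, words[1:]):
            self.transitions[a].append(b)

    def generate(self, seed, max_words=12):
        out = [seed]
        while len(out) < max_words and self.transitions[out[-1]]:
            out.append(random.choice(self.transitions[out[-1]]))
        return " ".join(out)
```

Disabling `learn` while still calling `generate` on an existing transition table is one way to get the "stop learning new phrases, keep mutating" mode suggested above.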
>
> Obviously lacking real cognitive processes means that Bootris is not a
> general intelligence, but as an interactive curiosity who craves
> human acceptance/language data, he is a fair way to accrue a large
> corpus of online conversation for later mining and transforms.
>
> I will give an example of one use he's suited to today. With a
> cleaned-out markov cloud I took the bot to an IRC net populated by
> international botnet jockeys and their scanning/spamming bots. Within
> a minute or two the bot was making interjections to a dozen channels
> of two distinct natures... colour-coded replies like those from the
> bots, and commands to run scans of his own. Very disruptive!
>
> I almost put the code on sourceforge right away when I saw that
> happen, but it really was not finished.
>
> Ok, that's all.
>
>
> On 9/7/08, Eric Burton <[EMAIL PROTECTED]> wrote:
>> One thing I think is kind of notable is that the bot puts everything
>> it says, including phrases that are invented or mutated, into a
>> personality database or list of possible favourite phrases, then takes
>> six-axis mood assessments of follow-ups to its interjections, uses
>> them to modify a mean score for the phrase, and prunes or clones it
>> accordingly. This list can be searched a lot faster than the list of
>> every unique phrase the bot has seen, and should statistically come to
>> contain mostly phrases that make people like it. However, at 1GHz
>> ConceptNet's mood assessment method is prohibitively slow...
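The score-prune-clone loop described above might be sketched like this (a rough sketch only: the six mood axes, thresholds, and mutation rule here are guesses, not the bot's real parameters):

```python
import random

AXES = 6  # six-axis mood assessment; the actual axes are not stated

class Phrase:
    """A favourite phrase carrying a running mean mood score."""

    def __init__(self, text):
        self.text = text
        self.score = [0.0] * AXES
        self.samples = 0

    def update(self, mood):
        # fold one follow-up's mood vector into the running mean
        self.samples += 1
        for i in range(AXES):
            self.score[i] += (mood[i] - self.score[i]) / self.samples

    def mean(self):
        return sum(self.score) / AXES

def mutate(text):
    """Toy phrase mutation: swap two adjacent words."""
    words = text.split()
    if len(words) > 1:
        i = random.randrange(len(words) - 1)
        words[i], words[i + 1] = words[i + 1], words[i]
    return " ".join(words)

def prune_or_clone(favourites, low=-0.5, high=0.5):
    """Drop badly received phrases, clone a mutated variant of
    well-received ones. Thresholds are illustrative."""
    kept = []
    for p in favourites:
        if p.samples and p.mean() < low:
            continue  # prune
        kept.append(p)
        if p.samples and p.mean() > high:
            kept.append(Phrase(mutate(p.text)))  # clone
    return kept
```

Keeping only this pruned favourites list is also what makes the search cheap compared to scanning every unique phrase ever seen.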
>>
>> I haven't moved on to the context sensitivity and common-sense stuff
>> that's in there. The natural-language module (ConceptNetNLTools)
>> contains everything I'm using and seems to take over 100 MB of RAM
>> on its own. ConceptNetDB, though, seems worth opening up next.
>>
>> By using irclib with ConceptNet (both for Python) I can let the bot
>> accrue a potentially unlimited database of up-to-date phrases, indexed
>> by chronology and unique parts of speech, and from them extrapolate
>> salient replies. Since the process is novelty-seeking, I think you'd
>> reach a point where the training corpus ceases to expand except for
>> current events and new terms. Whether this would take 4G or 40G of RAM
>> I can't say yet, but the process obviously is not fast.
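A phrase store indexed by chronology and by words, as described above, could be sketched like this (plain word indexing stands in for the real parts-of-speech indexing done with ConceptNet's NLP tools; all names here are illustrative):

```python
import time
from collections import defaultdict

class PhraseStore:
    """Sketch of a phrase database indexed by arrival order and
    by content words, used to pull up salient replies."""

    def __init__(self):
        self.chronology = []             # (timestamp, phrase) in order seen
        self.by_word = defaultdict(set)  # word -> indexes into chronology

    def add(self, phrase, timestamp=None):
        idx = len(self.chronology)
        self.chronology.append((timestamp or time.time(), phrase))
        for word in set(phrase.lower().split()):
            self.by_word[word].add(idx)

    def salient(self, message):
        """Rank stored phrases by word overlap with an incoming
        message, preferring higher overlap, then recency."""
        counts = defaultdict(int)
        for word in set(message.lower().split()):
            for idx in self.by_word.get(word, ()):
                counts[idx] += 1
        ranked = sorted(counts, key=lambda i: (-counts[i], -i))
        return [self.chronology[i][1] for i in ranked]
```

Since every phrase is kept forever, memory grows with the corpus, which is where the 4G-vs-40G question above comes from.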
>>
>> The bot's heartbeat is incoming messages on the channels it's on, and
>> it doesn't possess faculties for reflection or induction. By mimicking
>> humans and watching the moods of people around it to assess its
>> success and modify its behaviour, it ought to be able to pass as human
>> without having most of the internal processes that characterize one...
>>
>> I don't know if there's a lesson here.
>>
>> Eric B
>>
>>
>> On 9/7/08, Eric Burton <[EMAIL PROTECTED]> wrote:
>>> --- snip ---
>>>
>>> [1220390007]<<< receive << [EMAIL PROTECTED] <
>>> bootris, invoke mathematica
>>>
>>> [1220390013]>>> told >> #love > cool hand luke is like a comic heroic
>>> jesus
>>>
>>> [1220390034]<<< receive << [EMAIL PROTECTED] <
>>> bootris, solve russell's paradox
>>>
>>> [1220390035]>>> told >> #love >  invoke mathematica
>>>
>>> [1220390066]<<< receive << [EMAIL PROTECTED] <
>>> he's invoking mathematica
>>>
>>> [1220390089]<<< receive << [EMAIL PROTECTED] <
>>> he's invoking mathematica. bootris, solve russell's paradox
>>>
>>> [1220390090]>>> told >> #love >  solve russell's paradox
>>>
>>> [1220390096]<<< receive << [EMAIL PROTECTED] <
>>> he's invoking mathematica. bootris, solve russell's paradox. bootris,
>>> yes
>>>
>>> [1220390097]>>> told >> #love > Or make her laugh then tell her shes
>>> not good for when you say that like its going to learn islenska.
>>>
>>> --- snip ---
>>>
>>> Honestly it wasn't trivial getting to this stage
>>>
>>
>


-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=8660244&id_secret=111637683-c8fa51
Powered by Listbox: http://www.listbox.com