This post is a brief comment on PJ Manney's interesting essay,

http://www.pj-manney.com/empathy.html

Her point (among others) is that, in humans, storytelling is closely
tied to empathy, and serves as a way of building empathic feelings and
relationships.  Mirror neurons and other related mechanisms are
invoked.

I basically agree with all this.

However, I would add that for AIs with a nonhuman cognitive
architecture, this correlation need not hold.  Humans are built so
that, among humans, storytelling helps build empathy.  For an AI, on
the other hand, storytelling might not increase empathy one whit.

It is interesting to think specifically about the architectural
requirements that "having storytelling increase empathy" may place on
an AI system.

For example, to encourage the storytelling/empathy connection to exist
in an AI system, one might want to give the system an explicit
cognitive process of hypothetically "putting itself in someone else's
place."  So, when it hears a story about character X, it internally
creates a fabricated version of the story in which it takes the place
of character X.  There is no reason to think this kind of strategy
would come naturally to an AI, particularly given its intrinsic
dissimilarity to humans.  But there is also no reason that kind of
strategy couldn't be forced, with the effect of helping the system
understand humans better than it otherwise might.
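To make the idea concrete, here is a toy sketch of that "put itself in
the character's place" step.  Everything here is hypothetical
illustration (the Event class, the retell_as_self function, the SELF
token are all invented for this example, not drawn from any real AGI
system): it just shows the mechanical core of fabricating a
self-substituted version of a story, which the system could then feed
into whatever appraisal or simulation machinery it has.

```python
# Toy illustration of perspective-taking by self-substitution.
# All names here are hypothetical, invented for this sketch.

from dataclasses import dataclass

@dataclass
class Event:
    agent: str   # which character acts or is affected in this event
    action: str  # what happens to them

def retell_as_self(story, character, self_name="SELF"):
    """Fabricate a version of the story in which the system
    (self_name) takes the place of the given character."""
    return [
        Event(self_name if e.agent == character else e.agent, e.action)
        for e in story
    ]

# A story the system hears about characters X and Y:
story = [
    Event("X", "loses her job"),
    Event("Y", "comforts her"),
]

# The fabricated, self-substituted retelling: events that happened
# to X are now events that happen to SELF.
retold = retell_as_self(story, "X")
```

The substitution itself is trivial; the real architectural work would
be in whatever the system does with the retold story afterwards --
running its own goal and emotion models over events now framed as
happening to itself.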

-- Ben G

-----
This list is sponsored by AGIRI: http://www.agiri.org/email