Ben Goertzel wrote:
However, it shouldn't be hard for AGIs to avoid the particularly simple
and glaring examples of conjunction fallacy that have been made famous
in the cognitive psychology literature...
Some of them, but not others. For an example of the more difficult case:
**
Two independent sets of professional analysts at the Second
International Congress on Forecasting were asked to rate the
probability of, respectively, "A complete suspension of diplomatic
relations between the USA and the Soviet Union, sometime in 1983" and
"A Russian invasion of Poland, and a complete suspension of diplomatic
relations between the USA and the Soviet Union, sometime in 1983". The
second set of analysts responded with significantly higher probabilities.
**
This is a type of conjunction fallacy on which an AI could arguably
beat a human, but only by expending more computing power to search
through many possible pathways from prior beliefs to the conclusion.
Given a more complex scenario, one that defeated the AI's search
capabilities, the AI would fail in a way essentially analogous to the
human, who conducts almost no search.
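The probability axioms make the analysts' pattern incoherent: the
conjunction "invasion AND suspension" can never be more probable than
"suspension" alone, whatever the underlying numbers are. A minimal
Monte Carlo sketch of this (all probabilities here are hypothetical,
chosen only for illustration):

```python
import random

def conjunction_check(p_invasion=0.1,
                      p_suspension_given_invasion=0.5,
                      p_suspension_given_no_invasion=0.01,
                      n=100_000, seed=0):
    """Estimate P(suspension) and P(invasion AND suspension) by sampling.

    The parameter values are made up for illustration; the point is that
    the conjunction's frequency is bounded by the conjunct's frequency
    regardless of what values are chosen.
    """
    rng = random.Random(seed)
    suspension_count = 0
    conjunction_count = 0
    for _ in range(n):
        invasion = rng.random() < p_invasion
        p_s = (p_suspension_given_invasion if invasion
               else p_suspension_given_no_invasion)
        suspension = rng.random() < p_s
        suspension_count += suspension
        conjunction_count += suspension and invasion
    return suspension_count / n, conjunction_count / n

p_s, p_both = conjunction_check()
# P(A and B) <= P(B) holds in every run, by construction:
# every sampled world counted for the conjunction is also
# counted for the lone conjunct.
assert p_both <= p_s
```

Rating the conjunction higher, as the second set of analysts did,
amounts to asserting p_both > p_s, which no coherent assignment of
probabilities can produce.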
--
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence