Bryan Bishop wrote:
> On Monday 12 November 2007 19:31, Richard Loosemore wrote:
>> Yikes, no: my strategy is to piggyback on all that work, not to try
>> to duplicate it.
>>
>> Even the Genetic Algorithm people don't (I think) dream of evolution
>> on that scale.

> Yudkowsky recently wrote an email on "preservation of the absurdity of the future." The method I have proposed requires a massive international effort and perhaps can only be started once we hit a few more billion births. It is not entirely absurd, however, since we would start the project with the investigation methods known today and slowly improve them until we have millions of people researching the millions of varied pathways in the brain. From what I have read of Novamente today, Goertzel might be hoping that the circuits in the brain are ultimately simple, or that some similar model, with simpler components building up to some greater actor-exchange medium, effectively mimics the brain to some degree.

Yudkowsky's ramblings don't cut much ice with me.

Ben is not so much interested in whether the circuits (mechanisms) in the brain are simple or not, since he belongs to the school that believes that AGI does not need to be done exactly the way the human mind does it.

I, on the other hand, believe that we must stick fairly closely to an emulation of the *cognitive* level (not neural, but much higher up).

Even with everyone on the planet running evolutionary simulations, I do not believe we could reinvent an intelligent system by brute force.


Richard Loosemore
