Well, my dismissal of parallel systems may have been a little
exaggerated.  Almost all of us believe that AGI has to be based on some
kind of network, and parallelism does seem well suited to distributed
networks.  The problem, however, is that these systems cannot depend on
flat networks or simplistic three-dimensional networks.  The possible
complications of multi-dimensional systems (which might be represented
by complex potential for interrelationships on a three-dimensional
network) mean that the more complicated cases could, and probably would,
gum up the potential advantage of using parallel systems.  (The
advantage would only be low polynomial for the near future.)
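One toy way to see the worry (my own illustration, not anything from the
earlier discussion): in a synchronous network update, two nodes that share a
link cannot safely be updated in the same parallel round, so a greedy graph
coloring gives a rough upper bound on the number of sequential rounds needed.
A sparse "flat" network needs only a couple of rounds, while a densely
interrelated one degenerates toward fully sequential behavior:

```python
def greedy_color(n, edges):
    """Assign each node the smallest color not used by its neighbors.
    Nodes with the same color could be updated in the same parallel round."""
    adj = {v: set() for v in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    color = {}
    for v in range(n):
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color

n = 8
# Sparse, "flat" network: a simple ring.
ring = [(i, (i + 1) % n) for i in range(n)]
# Densely interrelated network: every node linked to every other.
clique = [(i, j) for i in range(n) for j in range(i + 1, n)]

rounds_ring = max(greedy_color(n, ring).values()) + 1
rounds_clique = max(greedy_color(n, clique).values()) + 1
print(rounds_ring, rounds_clique)  # → 2 8
```

The ring parallelizes in two rounds regardless of size, but the fully
interconnected case needs as many rounds as there are nodes, so the parallel
hardware buys you almost nothing there.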

Jim Bromer

On Fri, Jun 29, 2012 at 7:43 AM, Ben Goertzel <[email protected]> wrote:

>
> It seems possible that hybrid digital-analog computing could have some
>> life to it -- maybe someone will launch an APU (Analog Processing Unit)
>> card, similar to GPU cards today.  That would be really cool for some
>> applications, and could help with some AI algorithms.
>>
>
> Recurrent neural nets being an obvious example...
>
> ben g
>



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-c97d2393
