But the idea of a multi-layered network is very interesting to me, not
(just) layered processing (as in a neural network).

Jim Bromer

On Wed, Jan 28, 2015 at 11:25 PM, Jim Bromer <[email protected]> wrote:

> YKY said:
> "Multi-layer perceptron is a very inspiring case because it made use of
> the sigmoid function to allow differentiation to solve the learning problem
> via gradient descent.  The latter is a very efficient algorithm.  So I'm
> looking in that direction."
> ---------------------------------------------
> But not all learning can take place via gradients. There is too much
> wrong with the presumption that some kind of network learning modality
> was all that was missing to make it work. I am not just being negative.
>
> Jim Bromer
>
> On Wed, Jan 28, 2015 at 4:44 PM, YKY (Yan King Yin, 甄景贤) via AGI <
> [email protected]> wrote:
>
>> On Wed, Jan 28, 2015 at 1:10 AM, Matt Mahoney via AGI <[email protected]>
>> wrote:
>>
>>> YKY, is this building on your last 10 years of work on Genifer? What
>>> problems did you encounter that require a fundamental redesign?
>>>
>>
>> In logic-based AI, the central algorithm is proof search, which is
>> combinatorial search.  This cannot be avoided as long as formulas are
>> symbolic and discrete.  Although heuristics can be designed to speed up the
>> search, such techniques are not very inspiring in the sense that they don't
>> use advanced maths.  (Though there is still a possibility that classical AI
>> heuristics are sufficient to bootstrap AGI.)
>>
>> Multi-layer perceptron is a very inspiring case because it made use of
>> the sigmoid function to allow differentiation to solve the learning problem
>> via gradient descent.  The latter is a very efficient algorithm.  So I'm
>> looking in that direction.  Deep learning is fashionable now and it can
>> potentially lead to techniques capable of learning complex structures.
>>
>
>
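The point YKY makes about the sigmoid can be sketched in a few lines: because the activation is differentiable, the chain rule yields gradients for every weight, and plain gradient descent does the learning. Below is a minimal two-layer perceptron trained on XOR (a task a single-layer perceptron cannot learn). All names, layer sizes, and hyperparameters are illustrative choices, not anything specified in the thread:

```python
import numpy as np

def sigmoid(x):
    # Differentiable squashing function; its derivative is s * (1 - s),
    # which is what makes backpropagation straightforward.
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# XOR truth table as training data.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 sigmoid units, one sigmoid output unit.
W1 = rng.normal(size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))
b2 = np.zeros(1)

lr = 1.0
for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)      # hidden activations, shape (4, 4)
    out = sigmoid(h @ W2 + b2)    # network output, shape (4, 1)

    # Backward pass: gradient of mean-squared error via the chain rule.
    # Each sigmoid contributes a factor s * (1 - s) to the gradient.
    err = out - y
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent update.
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

final_loss = float(((out - y) ** 2).mean())
```

No proof search anywhere: the whole learning problem is reduced to following a gradient, which is the efficiency YKY is pointing at.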



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
