Yes, there is quite a bit of evidence. I wrote a paper about the membrane
proteins that are likely involved:
https://www.sciencedirect.com/science/article/abs/pii/S1476927123000117?dgcid=author

If you do not have access to it, let me know and I can send you a copy.

Danko


Dr. Danko Nikolić
www.danko-nikolic.com
https://www.linkedin.com/in/danko-nikolic/
-- I wonder, how is the brain able to generate insight? --


On Wed, Oct 18, 2023 at 10:33 PM Matt Mahoney <[email protected]>
wrote:

> Do you have any experimental results with temporary connections in
> neural networks? It's an interesting idea but it needs to be tested for
> prediction accuracy on some benchmarks.
>
> On Wed, Oct 18, 2023, 2:56 PM Danko Nikolic <[email protected]>
> wrote:
>
>> Hi Matt,
>>
>> First, you should have gotten an A on your paper for such progressive
>> ideas at the time.
>>
>> Second, if you are asking these questions, then I have completely
>> failed to explain my proposal. Had I managed to explain the idea, you
>> would not have asked them.
>>
>> Hebb's rule is about plasticity, in other words, about learning. What I
>> am proposing is not about plasticity or learning; this is where the
>> difference lies. The gates open and close connections only transiently,
>> for example, for one second, and then they return to their default
>> state. This is something different from synapses and works in addition
>> to synapses/weights.
>>
>> This is a completely new type of mechanism, a new level of complexity
>> that did not exist before.
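>>
>> To make this concrete, here is a rough sketch in Python (the names and
>> the one-second value are arbitrary, just for illustration) of gates
>> that transiently open and close connections on top of fixed weights:
>>
>> import numpy as np
>>
>> class TransientlyGatedLayer:
>>     """A fixed-weight layer whose connections can be gated open or
>>     closed for a short time, independently of learning."""
>>
>>     def __init__(self, weights, default_gate=1.0, gate_duration=1.0):
>>         self.W = weights                    # learned weights, untouched here
>>         self.default_gate = default_gate    # resting state of every gate
>>         self.gate_duration = gate_duration  # seconds a gate stays switched
>>         self.gates = np.full(weights.shape, default_gate)
>>         self.timers = np.zeros(weights.shape)  # time left before reverting
>>
>>     def set_gate(self, i, j, state):
>>         """Transiently open (1.0) or close (0.0) the connection j -> i."""
>>         self.gates[i, j] = state
>>         self.timers[i, j] = self.gate_duration
>>
>>     def forward(self, x, dt=0.01):
>>         """One forward pass; expired gates revert to the default state."""
>>         y = (self.W * self.gates) @ x
>>         self.timers = np.maximum(self.timers - dt, 0.0)
>>         self.gates[self.timers == 0.0] = self.default_gate
>>         return y
>>
>> Note that self.W never changes here: only the gates do, and only for
>> about a second at a time.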
>>
>> I hope this helps.
>>
>> Danko
>>
>>
>> Dr. Danko Nikolić
>> www.danko-nikolic.com
>> https://www.linkedin.com/in/danko-nikolic/
>> -- I wonder, how is the brain able to generate insight? --
>>
>>
>> On Wed, Oct 18, 2023 at 5:22 PM Matt Mahoney <[email protected]>
>> wrote:
>>
>>> How is your proposal different from Hebb's rule? I remember reading in
>>> the 1970s, as a teenager, about how neurons represent mental concepts and
>>> activate or inhibit each other through two kinds of synapses. I had the idea
>>> that synapses would change states in the process of forming memories. At
>>> the time it was not known that synapses do this. In 1980 I described
>>> this model of classical conditioning in my freshman psychology class. I got
>>> a B on the paper. Years later I learned that Hebb had proposed the same idea
>>> in 1949.
>>>
>>> Connectionism is a simple idea that makes it easy to understand how
>>> brains work by learning associations between concepts, but it lacks a
>>> mechanism for adding new concepts. That problem is solved by representing
>>> concepts as linear combinations of neurons, but it makes a neural network
>>> more like a black box of inscrutable matrices.
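>>>
>>> A toy example in Python (with made-up numbers) of what I mean by
>>> concepts as linear combinations of neurons:
>>>
>>> import numpy as np
>>>
>>> rng = np.random.default_rng(0)
>>> n_neurons = 100
>>>
>>> # Two concepts as directions in the activation space of the same
>>> # neurons, rather than one dedicated neuron per concept.
>>> cat = rng.normal(size=n_neurons)
>>> dog = rng.normal(size=n_neurons)
>>>
>>> # A single network state can encode a blend of both at once.
>>> state = 0.7 * cat + 0.3 * dog
>>>
>>> # You can still read out roughly how much of each concept is present
>>> # (random directions are nearly orthogonal in high dimensions), but no
>>> # individual neuron is "the cat neuron" -- hence the inscrutability.
>>> print(state @ cat / (cat @ cat))   # close to 0.7
>>> print(state @ dog / (dog @ dog))   # close to 0.3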
>>>
>>> Your diagram shows a feed-forward network, but in reality there are
>>> connections going in all directions. Lateral inhibition within the same
>>> layer gives you a winner-take-all network as the mechanism for attention in
>>> a transformer. Positive feedback loops give you short-term memory. Negative
>>> feedback gives you logarithmic scaling of sensory perceptions.
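>>>
>>> For the lateral inhibition part, a toy winner-take-all sketch in
>>> Python (the constants are arbitrary):
>>>
>>> import numpy as np
>>>
>>> def wta_step(a, self_excite=0.1, inhibit=0.3):
>>>     """Each unit excites itself and inhibits every other unit."""
>>>     a = a + self_excite * a - inhibit * (a.sum() - a)
>>>     return np.clip(a, 0.0, 1.0)   # rates stay non-negative and bounded
>>>
>>> a = np.array([1.0, 0.8, 0.3])     # three units competing
>>> for _ in range(30):
>>>     a = wta_step(a)
>>> print(a)                          # roughly [1.0, 0.0, 0.0]: one winner
>>>
>>> The self-excitation is the positive feedback that keeps the winner
>>> active after the input is gone, i.e. a crude short-term memory.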
>>>
>>> On Wed, Oct 18, 2023, 10:14 AM Danko Nikolic <[email protected]>
>>> wrote:
>>>
>>>> Here is my proposal on what connectionism is missing in order to reach
>>>> a "true" AI, i.e., an AI that is much more similar to how the human brain
>>>> works.
>>>>
>>>> This is a nine-minute video on the secret that was missing:
>>>> https://www.youtube.com/watch?v=GVW6H4iCsTg&ab_channel=dankonikolic
>>>>
>>>> I hope it is clear enough.
>>>>
>>>> Comments and questions are welcome.
>>>>
>>>> Danko
>>>>
>>>>
>>>> Dr. Danko Nikolić
>>>> www.danko-nikolic.com
>>>> https://www.linkedin.com/in/danko-nikolic/
>>>> -- I wonder, how is the brain able to generate insight? --
>>>>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T1e8c992894d0432b-M57feef641b35826075264bb3
Delivery options: https://agi.topicbox.com/groups/agi/subscription
