I saw on your website that you have published several papers, but there are no links to them. Maybe gating has some advantages over ordinary neural networks, but it is not obvious what they are. Fully connected networks and transformers also implement arbitrary depth. Also, driving a neuron into cutoff or saturation would effectively be gating, wouldn't it?
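To make that last point concrete, here is a toy sketch in plain Python. The weights and the control input are made up for illustration and have nothing to do with the gating.ai code; the point is only that a strong inhibitory input that drives an ordinary sigmoid neuron into cutoff behaves much like a closed multiplicative gate on its other input.

    import math

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    # Explicit multiplicative gate: a gate signal g in [0, 1] scales the data signal x.
    def explicit_gate(x, g):
        return g * x

    # Ordinary sigmoid neuron with a data input x and a control input c.
    # The strongly negative control weight drives the unit into cutoff,
    # so c = 1 suppresses x much like a closed gate would.
    def saturating_neuron(x, c, w_x=1.0, w_c=-10.0):
        return sigmoid(w_x * x + w_c * c)

    for x in (0.0, 1.0, 2.0):
        print(x,
              round(explicit_gate(x, g=1.0), 3),       # explicit gate open
              round(explicit_gate(x, g=0.0), 3),       # explicit gate closed
              round(saturating_neuron(x, c=0.0), 3),   # neuron in its responsive range ("open")
              round(saturating_neuron(x, c=1.0), 5))   # neuron driven into cutoff ("closed")

The explicit gate multiplies the data signal by a separate gate signal; the saturated neuron approximates the same on/off behaviour without a separate multiplier.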
What about using your design to implement text prediction, measured by compression? This is something you could do on a small scale and test against existing benchmarks, and the technology would apply directly to LLMs. (A minimal sketch of the compression measurement is at the bottom of this message.)

On Sat, Nov 9, 2024, 2:59 AM Danko Nikolic <[email protected]> wrote:

> Hi Matt,
>
> Thank you for the question. I am working on a benchmark comparing vanilla deep learning to vanilla gating networks making inferences for increasingly difficult logical operations. I hope to have the results soon and will make sure to post them here.
>
> One can already access the demo code showing how gating works with logical operations. The PyTorch code can be accessed here: https://github.com/Gating-A-Strong-AI-Company/Gating-first-implementation/. In fact, by tweaking this code, anyone can perform such benchmarks.
>
> What we do not have, and will not have soon, is an application to more complex real-life problems that could be benchmarked against SoTA.
>
> This is because an effort is required to implement learning. Being written in PyTorch, the code is easy to scale to large networks. With the existing code, it is trivially easy to create a large network with many more neurons and gates, limited only by your RAM (thanks to the PyTorch community). However, learning has not been implemented yet, because these networks learn differently from deep learning: gating networks have to learn cumulatively, starting from simple things and gradually expanding their knowledge, one step at a time (again, this is how biological intelligence learns, so gating stays consistent with biological intelligence in that respect too). The necessary tools do not exist. An engineering effort will be required to streamline this process, as we need software tools for breaking tasks down into steps, tracking the steps of learning, testing them, adding more neurons and gates when the current system hits a wall, and so on. An elaborate set of tools for such cumulative learning can be called AI-Kindergarten. I wrote about it before: http://www.ai-kindergarten.com/. We need a kindergarten for such AI, and the kindergarten requires software tools.
>
> I can work on the first benchmarks without such tools because simple logical operations can be implemented manually. I can mentally simulate the network operations and figure out which states the gates should assume once activated, and the rules for activating the gates. This is a great exercise for understanding how gating works, and I highly recommend it, using the code above. However, without additional tools, it feels like writing code in assembly or, even worse, writing machine code directly. This is not sustainable. Improvements in tools are needed for teaching networks and, thus, for making them reach higher levels of intelligence. Everyone is invited to contribute. I will create multiple routes for people to get involved, either through open source or in the company.
>
> Danko
>
> Dr. Danko Nikolić
> CEO, Robots Go Mental
> www.robotsgomental.com
> www.danko-nikolic.com
> https://www.linkedin.com/in/danko-nikolic/
> -- I wonder, how is the brain able to generate insight? --
>
>
> On Sat, Nov 9, 2024 at 5:49 AM Matt Mahoney <[email protected]> wrote:
>
>> Do you have any benchmarks or experimental results?
>> On Thu, Nov 7, 2024, 1:44 PM Danko Nikolic <[email protected]> wrote:
>>
>>> Dear all,
>>>
>>> I think I have figured out what we were missing to reach human, or better said, biological intelligence. I think I figured it out a while ago, but I struggled with explaining it to the world. I suppose everyone would agree that if you had a solution to all the hard problems of AI and cognitive science, the solution might be so counterintuitive that people would have a hard time understanding you. Hence, my papers, which were quite abstract, only found resonance with a limited number of people.
>>>
>>> Well, I have made much better progress since I introduced gating. Now I get people on board quite a bit more easily. It is still not perfect, but the proportion of people interested is growing much faster. I think I have made progress in explaining to the world what it is that I am trying to achieve.
>>>
>>> I invite the AGI community to explore www.gating.ai. There is some additional information there that may ring a few bells. Admittedly, there is still a lot of work to be done to make my concepts more approachable. But I am making progress, it seems.
>>>
>>> Also, there is code implementing gating:
>>> https://github.com/Gating-A-Strong-AI-Company/Gating-first-implementation/
>>>
>>> This code may be what some people have been missing. Please let me know if you would like to join the effort to build gating-based intelligent systems. I am finding ways to get people involved and rewarded for their contributions.
>>>
>>> Danko
>>>
>>>
>>> Dr. Danko Nikolić
>>> CEO, Robots Go Mental
>>> www.robotsgomental.com
>>> www.danko-nikolic.com
>>> https://www.linkedin.com/in/danko-nikolic/
>>> -- I wonder how the brain is able to generate insight --
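As promised above, here is a minimal sketch of how text prediction can be measured by compression. The Laplace-smoothed order-0 byte model and the sample string are placeholders of my own, unrelated to the gating code; a gating network (or any other predictor) would take the model's place, and the score is the standard sum of -log2 p(actual next byte).

    import math
    from collections import defaultdict

    # Score any next-byte predictor as a compressor:
    # total bits = sum over the text of -log2 p(actual next byte).
    def bits_per_char(text):
        data = text.encode("utf-8")
        counts = defaultdict(lambda: 1)   # Laplace-smoothed, adaptive order-0 model
        total = 256                       # one pseudo-count for each possible byte value
        bits = 0.0
        for b in data:
            bits += -math.log2(counts[b] / total)
            counts[b] += 1                # update the model after coding each byte
            total += 1
        return bits / len(data)

    sample = "the quick brown fox jumps over the lazy dog " * 50
    print(round(bits_per_char(sample), 3), "bits per character")

Better prediction means fewer bits per character, which is exactly what existing text-compression benchmarks measure, so a small gating-based predictor could be scored this way with no extra infrastructure.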
