[agi] Re: Isn't "Pooling" basically "Activation Function" !?
Actually, I agree that "pooling" is like a type of transformation function, or a specific convolutional kernel with fixed weights rather than weights updated by backpropagation. The "activation function" you described above is, in my opinion, more like a filter that passes large objects and ignores small details, which is also similar in meaning to "convolution". A more common view of "pooling" is that it is a form of downsampling, filtering information much like an attention mechanism. By the way, there are many different pooling methods, of which "max pooling" is one of the most commonly used.
-- Artificial General Intelligence List: AGI Permalink: https://agi.topicbox.com/groups/agi/T7d826b6cfc147f30-Mc814567e7c0cb5b6f38af373 Delivery options: https://agi.topicbox.com/groups/agi/subscription
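The point that pooling is downsampling with fixed behavior (nothing learned, nothing updated by backpropagation) can be shown in a minimal NumPy sketch of 2x2 max pooling; the function name and shapes here are my own, not from any particular framework:

```python
import numpy as np

def max_pool_2x2(x):
    """2x2 max pooling: a fixed downsampling step with no learned weights.

    Each output cell is the max of a non-overlapping 2x2 block, so the
    operation keeps the strongest response and discards small details.
    """
    h, w = x.shape
    # Trim odd edges, then view the array as (h//2, 2, w//2, 2) blocks
    # and take the max within each block.
    x = x[:h - h % 2, :w - w % 2]
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

x = np.arange(16, dtype=float).reshape(4, 4)
print(max_pool_2x2(x).tolist())  # → [[5.0, 7.0], [13.0, 15.0]]
```

By contrast, average pooling is literally a convolution with a fixed kernel of weights 0.25, which is one way to see pooling as "a convolutional kernel with fixed weights".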
[agi] Re: Isn't "Pooling" basically "Activation Function" !?
Don't any of you even know, or have an opinion?
-- Artificial General Intelligence List: AGI Permalink: https://agi.topicbox.com/groups/agi/T7d826b6cfc147f30-Md26b5302b5203aede2141a1b
Re: [agi] Mindlessness.
The damn good thing is that even if our economy and hardware dwindle, humans still retain the seed/wisdom. We know how to build Nvidia computers, AI algorithms, etc. The human genome and an AI algorithm are MUCH smaller than the force each creates. A human cell's DNA is tiny, yet it grows into the body/economy. It's all there, after millennia of evolution.
-- Artificial General Intelligence List: AGI Permalink: https://agi.topicbox.com/groups/agi/Tc32c76a0c85e2ca9-Mbeba926c6baa317715e31614
Re: [agi] Uhm, Does Consciousness Collapse the Wave Function?
Insects could be isomorphically compressed into much smaller models. Let's say there's an AGI Demoscene. Has anyone ever done Demoscene? 64K AGI. I would say allocate 2-4K to consciousness synthesis, assuming an advanced quantum computer system, not like the Model T's they're pioneering now. Though I just read about someone entangling 15 trillion atoms, so the technology is moving. Even for a proto-AGI classical mathematical model in code, you could say that's 64K of pure intelligence, but really some of it would be consciousness... and memory. Is memory intelligence? It's a part of it.
-- Artificial General Intelligence List: AGI Permalink: https://agi.topicbox.com/groups/agi/T51eb63417278f283-M9f4436e77f6a176fb91fb163
Re: [agi] Uhm, Does Consciousness Collapse the Wave Function?
On Tue, Jun 9, 2020, 5:48 AM John Rose wrote:
> Conscious systems are more efficient than non-conscious ones. Communication is enhanced. Conscious agents can predict other similarly conscious agents' behavior and make decisions based on those predictions with confidence.
>
> Take two systems of ants, one natural and the other p-zombie robots. Will the emergent structure of the anthill be as efficient with the p-zombies? There is a complex-systems barrier.

So by consciousness you mean intelligence, as measured by prediction accuracy. This accounts for efficient communication, because efficient communication depends on predicting what the recipient already knows and transmitting only the difference. Intelligence depends on knowledge and computing power. So now we can address insect consciousness by estimating their intelligence relative to humans. Human brains are a million times larger, so we can assume a million times more operations per second and storage capacity. Human knowledge is on the order of 10^9 bits, half inherited and half learned and stored in long-term memory. Insects can't be trained and have fewer than 100 bits of learned knowledge, for things like bees remembering where they found pollen to tell the hive. Inherited knowledge is about 10-30 times greater: the human genome is 3 Gb, no mammal has less than 2 Gb, and insects range from 91 Mb to over 7 Gb. https://pubmed.ncbi.nlm.nih.gov/21877225/
-- Artificial General Intelligence List: AGI Permalink: https://agi.topicbox.com/groups/agi/T51eb63417278f283-Md9d8bad4241408b69ad5
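The claim that communication efficiency depends on prediction can be made concrete with the standard information-theoretic identity: the ideal code length for an event the recipient assigns probability p is -log2(p) bits. A sketch, using only that textbook formula (the function name is mine):

```python
import math

def message_bits(prob):
    """Ideal code length in bits for an event the recipient
    predicts with probability prob (Shannon code length)."""
    return -math.log2(prob)

# A good predictor assigns high probability to what actually happens,
# so the residual message ("the difference") is short; a poor predictor
# forces a long message.
print(message_bits(0.99))  # ~0.014 bits: almost nothing to transmit
print(message_bits(0.01))  # ~6.64 bits: the surprise must be spelled out
```

This is why better prediction of "what the recipient already knows" directly buys shorter transmissions: only the improbable part of a message costs bits.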
[agi] Re: AGI Guaranteed to Think
Uhm, I don't know how to break the news to ya... MSIE is all but dead. Might want to go with Edge or Chrome or something. Just a heads up.
-- Artificial General Intelligence List: AGI Permalink: https://agi.topicbox.com/groups/agi/T5c4cba19fc03e7ef-Ma03bb46851214485c2450acd
[agi] AGI Guaranteed to Think
Ghost in the Machine has been renamed "Ghost Strong AI Guaranteed to Think with MSIE" to stimulate attempts at rebuttal from Netizens who doubt such a claim. To prove the claim false, one must define what thinking is, and then see whether the Ghost.html satisfies that definition. If thinking is defined as the mental association from concept to concept and the expression of the resulting idea in a natural language such as English or Latin or Russian, then it becomes quite obvious that the claim is true, because the JavaScript AI forms concepts and discusses ideas in various human languages. http://ai.neocities.org/EnThink.html -- JavaScript AI thinking in English. http://ai.neocities.org/LaThink.html -- JavaScript AI thinking in Latin. http://ai.neocities.org/RuThink.html -- JavaScript AI thinking in Russian. https://groups.google.com/d/msg/comp.lang.javascript/xbRzUDuPyXQ/tBSVXvObAgAJ
-- Artificial General Intelligence List: AGI Permalink: https://agi.topicbox.com/groups/agi/T5c4cba19fc03e7ef-Mcdbf63f8f76aecfd8028dc67
Re: [agi] Uhm, Does Consciousness Collapse the Wave Function?
Gods may be emergent structures in the complex systems of societies. Essentially state machines that govern individual behavior to some extent, automating some of our decisions until... And these emergent gods go through cycles as mortals rebel. Then there's the gods' god... wait, this was supposed to be a pursuit of wave function collapse, WTF? ... Maybe AGI will be a gods' god. AGGI :)
-- Artificial General Intelligence List: AGI Permalink: https://agi.topicbox.com/groups/agi/T51eb63417278f283-Maa05b5e770f1c1677ce24296
Re: [agi] Uhm, Does Consciousness Collapse the Wave Function?
So that's why I hear a cheerful Ho Ho Ho over my house during the winter. All things are alive, not just animals. It's up to you what you want to murder, and up to them to stop you. Thinking in a brain or computer doesn't do anything different from an ordinary algorithm, except for what the swarm does as a result.
-- Artificial General Intelligence List: AGI Permalink: https://agi.topicbox.com/groups/agi/T51eb63417278f283-M69668048b843d3ea2d7e590c
Re: [agi] Uhm, Does Consciousness Collapse the Wave Function?
> An observer is any measuring device with at least one bit of memory. This
-- Artificial General Intelligence List: AGI Permalink: https://agi.topicbox.com/groups/agi/T51eb63417278f283-M6ac303278fbe808a95848dc9
Re: [agi] Uhm, Does Consciousness Collapse the Wave Function?
On Tue, 9 Jun 2020, John Rose wrote:
> Miles Davis said that the gods don't punish people by not giving them what they want. The gods punish people, he said, by giving them what they want and then not giving them time.

Cool, I hadn't heard that quote. It could just be that the gods punish people by giving them what they want. And that's why the gods are laughing.
-- Artificial General Intelligence List: AGI Permalink: https://agi.topicbox.com/groups/agi/T51eb63417278f283-M83d526388d18d43ecc696e6d
Re: [agi] Uhm, Does Consciousness Collapse the Wave Function?
Conscious systems are more efficient than non-conscious ones. Communication is enhanced. Conscious agents can predict other similarly conscious agents' behavior and make decisions based on those predictions with confidence. Take two systems of ants, one natural and the other p-zombie robots. Will the emergent structure of the anthill be as efficient with the p-zombies? There is a complex-systems barrier.
-- Artificial General Intelligence List: AGI Permalink: https://agi.topicbox.com/groups/agi/T51eb63417278f283-M67ddb5a0a27257a635b22c49