Re: Re: Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-11-21 Thread Matt Mahoney via AGI
Both agents have the same complexity after training but not before. On Wed, Nov 21, 2018, 1:24 AM ducis > > Forgive me for not understanding the Legg paper completely, but > how would you separate a 1MB "AI agent" executable plus a 1PB file of > trained model (by "sucking data from internet"),

Re: Re: Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-11-20 Thread ducis
Forgive me for not understanding the Legg paper completely, but how would you separate a 1MB "AI agent" executable plus a 1PB file of trained model (by "sucking data from internet"), from a 1PB executable compiled from manually built source code? I don't see how the latter can be classified
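The distinction ducis asks about can be made concrete with a toy description-length measure (a sketch only, not the actual Legg/Hutter formalism): approximate an agent's complexity by the compressed size of everything needed to reproduce its behavior. On that measure, a small agent plus a large trained model and one monolithic executable with the same content are equivalent descriptions, which is Matt's point that the two have the same complexity after training.

```python
import zlib

# Toy illustration (an assumption-laden sketch, not Legg's formal definition):
# measure an agent's description length as the compressed size of all parts
# needed to reproduce its behavior.
def description_length(*parts: bytes) -> int:
    """Compressed size of the concatenation of all parts, in bytes."""
    return len(zlib.compress(b"".join(parts), 9))

# Case 1: a small "agent" program plus a large trained model file.
agent_code = b"def act(obs): return model[obs]"
trained_model = bytes(range(256)) * 4000   # stand-in for learned parameters

# Case 2: the same content shipped as one monolithic "executable".
monolithic = agent_code + trained_model

# The measure depends on total information content, not on how it is
# packaged into files: both descriptions compress to the same size.
assert description_length(agent_code, trained_model) == description_length(monolithic)
```

The split into "1MB executable + 1PB model" versus "1PB compiled executable" is invisible to a description-length measure; only the total information needed to reproduce the behavior counts.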

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-11-19 Thread Taylor Stempo via AGI
Black ops the first time since we have been able to ohmm-- thought you were mathoneg#/ On Mon, Nov 19, 2018, 2:49 PM Taylor Stempo Love starting to read this... just started. > > Flamxotr > > On Sun, Sep 9, 2018, 12:42 PM John Rose >> How I'm thinking lately (might be totally wrong,

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-28 Thread Nanograte Knowledge Technologies via AGI
And your words remind me of polar pulsation in context of thesis and antithesis. As a superpattern, the Torus seems truly [content] independent, a singularity. From: John Rose Sent: Friday, 28 September 2018 11:37 AM To: 'AGI' Subject: RE: [agi] E=mc^2 Morphism

RE: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-28 Thread John Rose
> -Original Message- > From: Nanograte Knowledge Technologies via AGI > > John. considering eternity, what you described is but a finite event. I dare > say, > not only consciousness, but cosmisity. > Until one comes to terms with their true insignificance will they not grasp their

RE: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-28 Thread John Rose
> -Original Message- > From: Jim Bromer via AGI > > John, > Can you map something like multipartite entanglement to something more > viable in contemporary computer programming? I mean something simple > enough that even I (and some of the other guys in this group) could > understand? Or

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-27 Thread Nanograte Knowledge Technologies via AGI
John. considering eternity, what you described is but a finite event. I dare say, not only consciousness, but cosmisity. Rob From: Jim Bromer via AGI Sent: Thursday, 27 September 2018 7:29 PM To: AGI Subject: Re: [agi] E=mc^2 Morphism Musings... (Intelligence

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-27 Thread Jim Bromer via AGI
John, Can you map something like multipartite entanglement to something more viable in contemporary computer programming? I mean something simple enough that even I (and some of the other guys in this group) could understand? Or is there no possible model that could be composed from contemporary

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-27 Thread Matt Mahoney via AGI
Gravity and other laws of physics are explained by the anthropic principle. The simplest explanation by Occam's Razor is that all possible universes exist and we necessarily observe one where intelligent life is possible. On Thu, Sep 27, 2018, 5:32 AM Jim Bromer via AGI wrote: > Science

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-27 Thread Jim Bromer via AGI
Science does not have a good theory about what causes gravity. You can deny it and say that science has explained gravity. Mass 'causes' gravity. Would you conclude that gravity does not exist because it is actually only mass? Or you come up with something like: mass is just the interruption of

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-25 Thread Jim Bromer via AGI
I want to try to have a more positive attitude about other people's crackpot ideas. It is taking me a few days to understand what people are saying or even why people are motivated to talk about the inexplicable experience of consciousness in an AI discussion group. But I will take some time off

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-25 Thread Jim Bromer via AGI
I apologize for making personal attacks. I did not mean my comments to come out that way. I think there are a number of Native American tribes who believe that the spirit imbues everything and everywhere. I do not actually disagree with that. However, that does not mean that the spirit of a rock

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-24 Thread Jim Bromer via AGI
ment of knowing. > Esoterically, I'd say qualia is that absolute moment when individual > consciousness/intelligence potential is realized. > > Computational models already exist for most of the components and > functionality I mentioned. As such, I think it has total relevance for the > step-b

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-24 Thread Nanograte Knowledge Technologies via AGI
an AGI model. Thoughts? Rob From: Jim Bromer via AGI Sent: Monday, 24 September 2018 8:02 PM To: AGI Subject: Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?) Matt's response - like an adolescent's flip remark - is evidence of the kind of denial that I mention

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-24 Thread Jim Bromer via AGI
Matt's response - like an adolescent's flip remark - is evidence of the kind of denial that I mentioned. Jim Bromer On Mon, Sep 24, 2018 at 10:49 AM Matt Mahoney via AGI wrote: > > I wrote a simple reinforcement learner which includes the line of code: > > printf("Ouch!\n"); > > So I don't see

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-24 Thread Matt Mahoney via AGI
I wrote a simple reinforcement learner which includes the line of code: printf("Ouch!\n"); So I don't see communication of qualia as a major obstacle to AGI. Or do you mean something else by qualia? On Mon, Sep 24, 2018, 5:21 AM John Rose wrote: > > -Original Message- > > From: Matt
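Matt's quip can be fleshed out into runnable form. The following is a minimal sketch (an assumed reconstruction, not his actual program) of a tabular reinforcement learner over two actions that reports "Ouch!" whenever it receives negative reward:

```python
import random

# A minimal two-action reinforcement learner (a hypothetical sketch, not
# Matt's actual code). The agent prints "Ouch!" on negative reward.
def run(steps: int = 20, seed: int = 0) -> dict:
    random.seed(seed)
    value = {0: 0.0, 1: 0.0}          # estimated value of each action
    for _ in range(steps):
        # epsilon-greedy: mostly pick the best-valued action, explore 20%
        if random.random() > 0.2:
            action = max(value, key=value.get)
        else:
            action = random.choice([0, 1])
        reward = 1.0 if action == 1 else -1.0   # action 0 "hurts"
        if reward < 0:
            print("Ouch!")            # the learner's one-line report of pain
        value[action] += 0.5 * (reward - value[action])  # incremental update
    return value

values = run()
# After learning, the "painful" action is valued below the rewarded one.
assert values[0] < values[1]
```

Whether the `print("Ouch!")` constitutes communicated qualia is exactly the question under dispute in the thread; the code only shows that the behavioral report is trivial to produce.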

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-24 Thread Jim Bromer via AGI
John, There are aspects of the intelligent understanding of the world (universe of things and ideas) that can be modelled and simulated. I think this is computable in an AI program except the problem of complexity would slow the modelling down so much that it would not be effective enough (at this

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-24 Thread Jim Bromer via AGI
ce explains that your brain runs a program." vs So does >> philosophy. Are they both equally correct, therefore philosophy = science? >> >> Inter alia, you still did not explain anything much, did you? >> >> Rob

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-23 Thread Jim Bromer via AGI
to do the semantic work. Or is it symbolic of another > problem? I think it's a very brave thing to talk publicly about a subject > we all agree we seemingly know almost nothing about. Yet, we should at > least try to do that as well. > > Therefore, to explain is to know?

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-23 Thread Nanograte Knowledge Technologies via AGI
From: Matt Mahoney via AGI Sent: Sunday, 23 September 2018 4:02 PM To: AGI Subject: Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?) Science doesn't explain everything. It just tries to. It doesn't explain why the universe exists. Philosop

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-23 Thread Jim Bromer via AGI
we should at least try to >> do that as well. >> >> Therefore, to explain is to know? >> >> Rob >> ____________________ >> From: Jim Bromer via AGI >> Sent: Saturday, 22 September 2018 6:12 PM >> To: AGI >> Subject: Re: [agi] E=mc^2 Morph

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-23 Thread Matt Mahoney via AGI
t as well. > > Therefore, to explain is to know? > > Rob > -- > *From:* Jim Bromer via AGI > *Sent:* Saturday, 22 September 2018 6:12 PM > *To:* AGI > *Subject:* Re: [agi] E=mc^2 Morphism Musings... > (Intelligence=math*consciousness^2 ?)

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-22 Thread Jim Bromer via AGI
The theory that contemporary science can explain everything requires a fundamental denial of history and a kind of denial about the limits of contemporary science. That sort of denial of common knowledge is ill suited for adaptation. It will interfere with your ability to use scientific method. Jim

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-22 Thread Jim Bromer via AGI
Qualia is what perceptions feel like, and feelings are computable, and they condition us to believe there is something magical and mysterious about it? This is science fiction. So science has already explained Chalmers' Hard Problem of Consciousness. He just got it wrong? Is that what you are

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-22 Thread Jim Bromer via AGI
Let's say that someone says that quantum effects can explain qualia. I might respond by saying that sort of speculation is not related to contemporary computer science. Then I get the reply: What do you mean?! Computers are used heavily in quantum science! Yes, so computers are used to make

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-22 Thread Jim Bromer via AGI
But you are still missing the definition of qualia. Wikipedia has a thing on it and I am sure SEP does as well. Because there are reports of subjective experience we know that we share something of the nature of experience. Common sense can tell us that computers do not. How do we know that

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-21 Thread Nanograte Knowledge Technologies via AGI
of my knowledge base? I'm now setting the threshold to zero. Rob From: Matt Mahoney via AGI Sent: Saturday, 22 September 2018 2:28 AM To: AGI Subject: Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?) John answered the question. Qualia

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-21 Thread Matt Mahoney via AGI
John answered the question. Qualia = sensory input compressed for communication. A thermostat has qualia because it compresses its input to one bit (too hot/too cold) and communicates it to the heater. On Fri, Sep 21, 2018, 2:00 PM Jim Bromer via AGI wrote: > > From: Matt Mahoney via AGI > > >
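Matt's thermostat example can be written out literally (a sketch of his analogy, not a claim about real thermostat firmware): the device compresses a continuous sensory input down to a single bit and communicates that bit to the heater.

```python
# Matt's thermostat analogy made literal: compress the whole sensory
# reading (temperature) to one bit and send it to the heater.
def thermostat(temperature_c: float, setpoint_c: float = 20.0) -> int:
    """Compress the sensory input to a single bit: 1 = too cold."""
    return 1 if temperature_c < setpoint_c else 0

def heater(bit: int) -> str:
    """The receiving agent acts on the one-bit message."""
    return "heat on" if bit else "heat off"

assert heater(thermostat(15.0)) == "heat on"    # too cold -> heater runs
assert heater(thermostat(25.0)) == "heat off"   # warm enough -> heater idle
```

Under Matt's definition, the one-bit message is the thermostat's "qualia"; the thread's disagreement is over whether that definition captures anything about subjective experience.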

RE: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-19 Thread John Rose
> -Original Message- > From: Matt Mahoney via AGI > > What do you think qualia is? How would you know if something was > experiencing it? > You could look at qualia from a multi-systems signaling and a compressionist standpoint. They're compressed impressed samples of the environment

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-13 Thread Matt Mahoney via AGI
On Thu, Sep 13, 2018, 12:12 PM John Rose wrote: > > -Original Message- > > From: Matt Mahoney via AGI > > > > We could say that everything is conscious. That has the same meaning as > > nothing is conscious. But all we are doing is avoiding defining > something that is > > really hard

RE: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-13 Thread John Rose
> -Original Message- > From: Matt Mahoney via AGI > > We could say that everything is conscious. That has the same meaning as > nothing is conscious. But all we are doing is avoiding defining something > that is > really hard to define. Likewise with free will. I disagree. Some things

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-13 Thread Matt Mahoney via AGI
We could say that everything is conscious. That has the same meaning as nothing is conscious. But all we are doing is avoiding defining something that is really hard to define. Likewise with free will. We will know we have properly modeled human minds in AGI if it claims to be conscious and have

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-12 Thread Nanograte Knowledge Technologies via AGI
Rob From: Matt Mahoney via AGI Sent: Tuesday, 11 September 2018 11:05 PM To: AGI Subject: Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?) On Mon, Sep 10, 2018 at 3:45 PM wrote: > You believe! Showing signs of communication protocol with futu

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-11 Thread Matt Mahoney via AGI
On Mon, Sep 10, 2018 at 3:45 PM wrote: > You believe! Showing signs of communication protocol with future AGI :) an > aspect of CONSCIOUSNESS? My thermostat believes the house is too hot. It wants to keep the house cooler, but it feels warm and decides to turn on the air conditioner. I

RE: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-10 Thread John Rose
> -Original Message- > From: Russ Hurlbut via AGI > > 1. Where do you lean regarding the measure of intelligence? - more towards > that of Hutter (the ability to predict the future) or towards > Wissner-Gross/Freer > (causal entropy - sort of a proxy for future opportunities; ref >

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-10 Thread johnrose
> -Original Message- > From: Matt Mahoney via AGI >... Yes, I'm familiar with these algorithmic information theory *specifics*. Very applicable when implemented in isolated systems... > No, it (and Legg's generalizations) implies that a lot of software and > hardware > is required and

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-10 Thread Russ Hurlbut via AGI
John - Thanks for a refreshingly new discussion for this forum. Just as you describe, it is quite interesting to see how seemingly disparate tracks can be combined and guided onto the same course. Accordingly, your presentation has brought to mind similar notions that appear to fit somewhere

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-10 Thread Matt Mahoney via AGI
On Mon, Sep 10, 2018 at 8:10 AM wrote: > Why is there no single general compression algorithm? Same reason as general > intelligence, thus, multi-agent, thus inter agent communication, thus > protocol, and thus consciousness. Legg proved that there are no simple, general theories of

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-10 Thread Nanograte Knowledge Technologies via AGI
ay, 10 September 2018 2:44 PM To: AGI Subject: Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?) Nanograte, > In particular, the notion of a universal communication protocol. To me it > seems to have a definite ring of truth to it. It does, doesn't it?! For years

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-10 Thread johnrose
Nanograte, > In particular, the notion of a universal communication protocol. To me it > seems to have a definite ring of truth to it. It does, doesn't it?! For years I've worked with signaling and protocols lending some time to imagining a universal protocol. And for years I've thought about

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-10 Thread johnrose
Matt, Zoom out. Think multi-agent not single agent. Multi-agent internally and externally. Evaluate this proposition not from first-person narrative and it begins to make sense. Why is there no single general compression algorithm? Same reason as general intelligence, thus, multi-agent, thus

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-10 Thread Nanograte Knowledge Technologies via AGI
to it. Please carry on as you are doing now... From: johnr...@polyplexic.com Sent: Monday, 10 September 2018 12:56 PM To: AGI Subject: Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?) Matt: > AGI is the very hard engineering problem of mak

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-10 Thread Mark Nuzz via AGI
I'll take jargon salad over buzzword soup any day. On Sun, Sep 9, 2018 at 3:26 PM Matt Mahoney via AGI wrote: > Recipe for jargon salad. > > Two cups of computer science. > One cup mathematics. > One cup electrical engineering. > One cup neuroscience. > One half cup information theory. > Four

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-09 Thread Matt Mahoney via AGI
AGI is the very hard engineering problem of making machines do all the things that people can do. Consciousness is not the magic ingredient that makes the problem easy. On Sep 9, 2018 10:08 PM, wrote: Basically, if you look at all of life (Earth only for this example) over the past 4.5 billion

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-09 Thread johnrose
Basically, if you look at all of life (Earth only for this example) over the past 4.5 billion years, including all the consciousness and all that “presumed” entanglement and say that's the first general intelligence (GI) the algebraic structural dynamics on the computational edge... is

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-09 Thread Matt Mahoney via AGI
Recipe for jargon salad. Two cups of computer science. One cup mathematics. One cup electrical engineering. One cup neuroscience. One half cup information theory. Four tablespoons quantum mechanics. Two teaspoons computational biology. A dash of philosophy. Mix all ingredients in a large bowl.

Re: [agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-09 Thread Jim Bromer via AGI
Consciousness computation (GI) is on the negentropic massive multi-partite entanglement frontier of a spontaneous morphismic awareness complexity - IOW on the edge of life’s consciousness based on manifestation of inter/intra-agent entanglement (in DNA perhaps?). Whoa! I'm roiling dude. I mean,

[agi] E=mc^2 Morphism Musings... (Intelligence=math*consciousness^2 ?)

2018-09-09 Thread John Rose
How I'm thinking lately (might be totally wrong, totally obvious, and/or totally annoying to some but it’s interesting): Consciousness Oriented Intelligence (COI) Consciousness is Universal Communications Protocol (UCP) Intelligence is consciousness manifestation AI is a computational