Re: [agi] Simulation

2019-09-28 Thread John Rose
On Saturday, September 28, 2019, at 4:59 AM, immortal.discoveries wrote: > Nodes have been dying ever since they were given life. But the mass is STILL here. Persistence is futile. We will leave Earth and avoid the sun. You're right. It is a sad state of affairs with the environment...the

Re: [agi] Simulation

2019-09-28 Thread immortal . discoveries
You can hear the ratchet's hypertrophy screeching.

Re: [agi] Simulation

2019-09-28 Thread immortal . discoveries
Nodes have been dying ever since they were given life. But the mass is STILL here. Persistence is futile. We will leave Earth and avoid the sun.

Re: [agi] Simulation

2019-09-27 Thread John Rose
Persist as what? Unpersist the sun rising, break the 99.99... % probability that it rises tomorrow. What happens? We burn.

Re: [agi] Simulation

2019-09-27 Thread immortal . discoveries
Well, we duplicate and build all our technology just to persist. So all you need is an intelligence that finds ways efficiently.

Re: [agi] Simulation

2019-09-27 Thread John Rose
On Friday, September 27, 2019, at 1:44 PM, immortal.discoveries wrote: > Describing intelligence is easier when you ignore the low-level molecules. What if it loops? I remember reading a book as a kid where a scientist invented a new powerful microscope, looked into it, and saw himself looking

Re: [agi] Simulation

2019-09-27 Thread immortal . discoveries
Describing intelligence is easier when you ignore the low-level molecules. What if we ignore the brain and focus on the hive? It may lead to what OpenAI did with the Hide & Seek bots.

Re: [agi] Simulation

2019-09-27 Thread John Rose
On Friday, September 27, 2019, at 10:57 AM, immortal.discoveries wrote: > We could say our molecules make the decision, korrelan :) And the microbiome bacteria, etc., transmitting through the gut-brain axis could have massively more complexity than the brain. "The gut-brain axis, a bidirectional

Re: [agi] Simulation

2019-09-27 Thread John Rose
On Friday, September 27, 2019, at 8:59 AM, korrelan wrote: > If the sensory streams from your sensory organs were disconnected, what would your experience of reality be? No sight, sound, tactile or sensory input of any description, how would you play a part/ interact with this wider network you

Re: [agi] Simulation

2019-09-27 Thread John Rose
On Tuesday, September 24, 2019, at 3:34 PM, korrelan wrote: > Reading back up the thread I do seem rather stern or harsh in my opinions; if I came across this way, I apologise. I didn't think that of you. We shouldn't be overly sensitive and afraid to offend. There is no right to not be

Re: [agi] Simulation

2019-09-27 Thread John Rose
On Tuesday, September 24, 2019, at 2:05 PM, korrelan wrote: > The realisation/ understanding that the human brain is a closed system, to me… is a first order/ obvious/ primary concept when designing an AGI or, in my case, a neuromorphic brain simulation. A human brain is merely an instance node on

Re: [agi] Simulation

2019-09-24 Thread korrelan
Reading back up the thread I do seem rather stern or harsh in my opinions; if I came across this way, I apologise. Believe it or not, I'm quite an amicable chap; I just lack/ forget the social graces on occasion. :)

Re: [agi] Simulation

2019-09-24 Thread korrelan
The realisation/ understanding that the human brain is a closed system, to me… is a first order/ obvious/ primary concept when designing an AGI or, in my case, a neuromorphic brain simulation. > Your model looks like it has a complexity barrier. On the contrary, I’m negating a complexity barrier by

Re: [agi] Simulation

2019-09-24 Thread John Rose
On Tuesday, September 24, 2019, at 7:36 AM, korrelan wrote: "the brain is presented with external patterns" "When you talk to someone" "Take this post as an example; I’m trying to explain a concept" "Does any of the actual visual information you gather" These phrases above, re-read them, are

Re: [agi] Simulation

2019-09-24 Thread korrelan
> How does ANY brain acting as a pattern reservoir get filled? No one fills/ places/ forces/ feeds/ writes information directly into a brain; the brain is presented with external patterns that represent information/ learning by other brains and that trigger a similar scenario in the receiver’s

Re: [agi] Simulation

2019-09-24 Thread John Rose
On Tuesday, September 24, 2019, at 7:07 AM, immortal.discoveries wrote: > The brain is a closed system when viewing others Uhm... a "closed system" that views. Not closed then? John

Re: [agi] Simulation

2019-09-24 Thread immortal . discoveries
We are machines; your home is a machine. The brain is a closed system when viewing others, but you're not "you"; it's all machinery, actually. There may be some magical consciousness but it has nothing to do with the physics really.

Re: [agi] Simulation

2019-09-24 Thread John Rose
On Monday, September 23, 2019, at 7:43 AM, korrelan wrote: > From the reference/ perspective point of a single intelligence/ brain there are no other brains; we are each a closed system and a different version of you exists in every other brain. How does ANY brain acting as a pattern reservoir

Re: [agi] Simulation

2019-09-23 Thread korrelan
From the reference/ perspective point of a single intelligence/ brain there are no other brains; we are each a closed system, and a different version of you exists in every other brain. We don’t receive any information from other brains; we receive patterns that our own brain interprets based

Re: [agi] Simulation

2019-09-23 Thread John Rose
On Sunday, September 22, 2019, at 6:48 PM, rouncer81 wrote: > actually no! it is the power of time. doing it over time steps is an exponent worse. Are you thinking along the lines of Konrad Zuse's Rechnender Raum? I just had to go read some again after you mentioned this :) John

Re: [agi] Simulation

2019-09-23 Thread John Rose
On Sunday, September 22, 2019, at 8:42 AM, korrelan wrote: > Our consciousness is like… just the surface froth, reading between the lines, or the summation of interacting logical pattern recognition processes. That's a very good, clear single-brain description of it. Thanks for that. I don't

Re: [agi] Simulation

2019-09-23 Thread rouncer81
Yeah, that's what I'm talking about, thanks: all permutations of space and time... and I'm getting a bit confused...

Re: [agi] Simulation

2019-09-23 Thread immortal . discoveries
The butterfly effect is like a tree: there are more and more paths as time goes on. In that sense, time is the number of steps. Time powers Search Space. Time=Search Space/Tree. So even if you have a static tree, it is really time, frozen. Unlike real life, you can go forward/backward wherever
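
A minimal worked example of "time powers the search space", assuming a fixed branching factor b at every step (b and t below are illustrative parameters, not figures from the thread):

    paths(t) = b^t
    b = 2, t = 10  ->  2^10 = 1,024 root-to-leaf paths
    b = 2, t = 20  ->  2^20 = 1,048,576 root-to-leaf paths

Each extra time step multiplies the number of paths by b, so the number of steps sits in the exponent, which is one way to read "time powers search space".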

Re: [agi] Simulation

2019-09-22 Thread rouncer81
actually no! it is the power of time. doing it over time steps is an exponent worse.

Re: [agi] Simulation

2019-09-22 Thread rouncer81
Oops, that thing about time not being the 4th dimension in a way was wrong, and I'm just being confusing on purpose, sorry about that. 2^(x*y*z*time) is all permutations of the 3D space (blipping on and off) over that many time steps... actually I'm not sure if I'm right or wrong about it.
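
A minimal sketch of the counting here, assuming the "space" is an x-by-y-by-z grid of cells that are each on or off at every step (the sizes below are made up for illustration):

    # Count on/off configurations of an x*y*z grid, per step and over t steps.
    # Illustrative sizes only; any small integers behave the same way.
    x, y, z, t = 2, 2, 2, 3
    per_step = 2 ** (x * y * z)      # all on/off states of the grid at one instant
    over_time = per_step ** t        # one such state chosen at every time step
    assert over_time == 2 ** (x * y * z * t)
    print(per_step, over_time)       # 256 16777216

Adding time steps multiplies the exponent by t rather than adding to it, which matches the "an exponent worse" remark elsewhere in the thread.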

Re: [agi] Simulation

2019-09-22 Thread korrelan
> Or network on top of network on top of network... turtles. Close... though more... networks within networks... Consciousness seems so elusive because it is not an ‘intended’ product of the connectome directly recognising sensory patterns; consciousness is an extra layer. The interacting

Re: [agi] Simulation

2019-09-22 Thread John Rose
On Saturday, September 21, 2019, at 7:24 PM, rouncer81 wrote: > Time is not the 4th dimension, time is actually powering space. (x*y*z)^time. And what's the layer on top of (x*y*z)^time that allows for intelligent interaction and efficiency to be expressed and executed in this physical

Re: [agi] Simulation

2019-09-21 Thread rouncer81
Agreed with everyone here on this subject. Yes, simulation - what have I got to say? Time is not the 4th dimension, time is actually powering space. (x*y*z)^time.
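
One possible reading of the two formulas in this exchange, offered only as an interpretation and assuming x*y*z is the number of cells in the space:

    (x*y*z)^time    -> ways to pick one cell of the grid at each of `time` steps (length-`time` trajectories)
    2^(x*y*z*time)  -> ways to set every cell on or off at every one of the `time` steps

The second count, from the follow-up post, treats the whole grid state per step rather than a single chosen cell.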

Re: [agi] Simulation

2019-09-21 Thread John Rose
On Saturday, September 21, 2019, at 11:01 AM, Stefan Reich wrote: > Interesting thought. In all fairness, we can just not really interact with a number which doesn't have a finite description. As soon as we do, we pull it into our finiteness and it stops being infinite. IMO there are only

Re: [agi] Simulation

2019-09-21 Thread Stefan Reich via AGI
> It is like saying that the vast majority of real numbers don't have a finite length description. I can't give you an example of one of those either. Interesting thought. In all fairness, we can just not really interact with a number which doesn't have a finite description. As soon as we do, we

Re: [agi] Simulation

2019-09-21 Thread Matt Mahoney
These aren't the only 4 possible simulation scenarios. There are probably others we cannot imagine because the simulation doesn't allow us to. And no, I can't give you an example for that reason. It is like saying that the vast majority of real numbers don't have a finite length description. I
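
A small sketch of the counting argument behind that claim, assuming a "finite description" means a finite string over some fixed finite alphabet (the binary alphabet below is just for illustration):

    from itertools import count, product

    ALPHABET = "01"  # any finite alphabet works; binary keeps the example small

    def finite_descriptions():
        """Enumerate every finite string in shortlex order.

        Every finite description appears at some finite index, so the set of
        all finite descriptions is countable. Cantor's diagonal argument shows
        the reals are uncountable, so all but countably many reals have no
        finite description -- yet no specific example can be exhibited, since
        writing one down would itself be a finite description.
        """
        for n in count(0):
            for chars in product(ALPHABET, repeat=n):
                yield "".join(chars)

    gen = finite_descriptions()
    print([next(gen) for _ in range(8)])  # ['', '0', '1', '00', '01', '10', '11', '000']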

Re: [agi] Simulation

2019-09-21 Thread immortal . discoveries
When I dream, I can see anything. I was in a huge desert full of buildings, no end in sight, creatures walking up a grass oompa-loompa land. Sometimes I have a new body and can even feel tentacles around a sewer wall while in my new body, not even touching that wall (I have a skin around the

Re: [agi] Simulation

2019-09-21 Thread John Rose
All four are partially correct. It is a simulation. And you're it. When you die, your own private Idaho ends, *poof*. This can all be modeled within the framework of conscioIntelligence, CI = UCP + OR. When you are that tabula rasa simuloid in your mother's womb, you begin to occupy a

Re: [agi] Simulation

2019-09-21 Thread Nanograte Knowledge Technologies
Hi Tim, I'm enjoying your website. Thanks for sharing. Robert

Re: [agi] Simulation

2019-09-20 Thread TimTyler
On 2019-09-02 16:39, Matt Mahoney wrote: Here are at least 4 possibilities, listed in decreasing order of complexity, and therefore increasing likelihood if Occam's Razor holds outside the simulation. 1. Only your brain exists. All of your sensory inputs are simulated by a model of a