@ Russ

Exciting! Thanks for this contribution.

The diagram maps closely onto what Gell-Mann (1994) describes, and 
diagrammatically represents, as large effective complexity at intermediate 
AIC (algorithmic information content). The state of largest effective 
complexity matches this result very closely.

He summarizes his sketch as a "crude illustration", meaning:
"...effective complexity of a system (relative to a properly functioning 
complex adaptive system as observer) varies with AIC, attaining high values 
only in the intermediate region between excessive order and excessive 
disorder." Further, he points out that: "Many important quantities that occur 
in discussions of simplicity, complexity, and complex adaptive systems [such as 
the AGI forum] share the property that they can be large only in that 
intermediate region." (p.60)

The above relates to information compressibility.
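As a crude sketch of that compressibility point, here is a short Python snippet using compressed length under zlib as a rough stand-in for AIC (the strings and thresholds are just illustrative assumptions, not anything from Gell-Mann's text): a maximally ordered string and a maximally disordered one sit at the two extremes where, on his account, effective complexity is low.

```python
import random
import zlib

def aic_estimate(data: bytes) -> int:
    """Rough proxy for algorithmic information content:
    the length of the data after lossless compression."""
    return len(zlib.compress(data))

# Excessive order: a trivially regular string (very low AIC).
ordered = b"ab" * 500

# Excessive disorder: incompressible random bytes (high AIC, yet on
# Gell-Mann's account still low effective complexity -- no regularities).
random.seed(0)
disordered = bytes(random.randrange(256) for _ in range(1000))

print(aic_estimate(ordered))     # far smaller than the original 1000 bytes
print(aic_estimate(disordered))  # roughly 1000 bytes or slightly more
```

Effective complexity would instead measure the AIC of a system's *regularities*, which is why it can be large only between these two extremes.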

Rob
   

Date: Wed, 18 Feb 2015 17:51:18 -0600
Subject: Re: [agi] Couple thoughts
From: [email protected]
To: [email protected]

...and now for something completely different...
http://mindblog.dericbownds.net/2015/02/psilocybin-as-key-to-consciousness.html
Regardless of the space-cowboy nature of the title of this blog post, which 
appeared today, there is relevant research behind it that touches on some of 
the well-made points presented in this thread.

On Wednesday, February 18, 2015, Mike Archbold via AGI <[email protected]> wrote:
On 2/17/15, Matt Mahoney via AGI <[email protected]> wrote:
> On Wed, Feb 18, 2015 at 12:09 AM, Mike Archbold <[email protected]>
> wrote:
>> I think under this approach, it "bans" work on trying to actually
>> figure out the answers to these tough questions, and instead places
>> emphasis on replicating the means (mechanics of the brain) that
>> generates whatever it is we call consciousness.  I mean, under this
>> school of thought, if we don't know how consciousness solves hard
>> problems, so what?  As long as it works by copying the
>> physics/mechanics/etc of the brain (that being the obviously gigantic
>> challenge, of course) that is all that is required.
>
> Consciousness is the feelings (reinforcement signals, mostly positive)
> that you associate with sensory perception and thoughts (recalled
> memories) as they are written into episodic memory (memory associated
> with a time or place). It only seems mysterious because reinforcement
> signals alter your beliefs. Your brain works that way because it
> increases your reproductive fitness. Even though you can't objectively
> believe what I just stated, you want your consciousness to continue by
> not dying.
>
> You don't need to model consciousness to solve most AI problems like
> vision, language, or robotics. You do need to model belief in
> consciousness as well as other types of reinforcement learning (such
> as beliefs in free will and identity) in order to model or predict
> human behavior. It is not hard to do that once you understand where
> these illusions come from.
>
> It is a distraction to think that you have to replicate consciousness
> to solve AI. It is like thinking that birds fly by some magic that you
> have to replicate in order to build airplanes.
>
> --
> -- Matt Mahoney, [email protected]


I see your point.  It's contentious.  There is still a certain appeal
to any approach that tries to skirt completely around the, well,
contentious problems.  Neural nets, as you know, have always been that
way, just a black box for the most part.  Whole brain emulation, which
to me means a straight copy of the highest fidelity, should be taken
into account, even if you don't agree with someone's approach under
that flag.  Whether or not consciousness is part of it is less
important than whether the copy is faithful to the original brain.  If
so, it will work.





> -------------------------------------------
> AGI
> Archives: https://www.listbox.com/member/archive/303/=now
> RSS Feed: https://www.listbox.com/member/archive/rss/303/11943661-d9279dae
> Modify Your Subscription: https://www.listbox.com/member/?&;
> Powered by Listbox: http://www.listbox.com










  
    
      

