I found it too vague.  The devil is in the details.  At some point you have to 
commit to an action, otherwise the world will pass you by.  The problem people 
have is not knowing in advance the outcomes of the actions we commit to.  The 
shocks (unanticipated negative consequences) and surprises help to drive our 
learning.  I don't think an AGI would prefer sitting on the fence.  I think it 
would commit to random actions, then deliberate actions, then experimental 
actions to understand the consequences of those actions.

But that's just what I think.  Opinions abound like kangaroos around here.

~PM
Date: Mon, 10 Feb 2014 15:14:26 -0800
Subject: Re: [agi] A new equation for intelligence?
From: [email protected]
To: [email protected]

This seems like a rather poor definition of intelligence.  There is no way to 
get a handle on all future possibilities.  It is also not guaranteed that 
intelligence will not eliminate all but a few alternatives that it prefers.  It 
is not obvious that it will simply keep as many options open as possible. 
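For anyone who hasn't watched the talk: the equation under discussion, as given in Wissner-Gross and Freer's paper "Causal Entropic Forces" (Phys. Rev. Lett. 110, 168702, 2013), is a force that pushes a system toward states from which the greatest diversity of future paths remains reachable:

```latex
% Causal entropic force (Wissner-Gross & Freer, PRL 110, 168702, 2013).
% S_c(X, \tau) is the "causal path entropy": the entropy of the
% distribution of possible trajectories starting from state X over a
% finite time horizon \tau; T_c is a constant ("causal temperature")
% setting the strength of the force.
F(X_0) = T_c \, \nabla_X S_c(X, \tau) \big|_{X_0}
```

So "keeping options open" is not a metaphor here; it is literally the gradient of future path entropy.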


On Mon, Feb 10, 2014 at 3:03 PM, Mike Archbold <[email protected]> wrote:

Just a note that this is an example of a physics-first approach to AGI.  Is
there any science that HASN'T been used as a starting point for AGI?
There really should be a science of starting points in AI.  To me it
should be metaphysics, but that is just me, I realize....



On 2/9/14, John Rose <[email protected]> wrote:

> Intelligence has heat and heat dissipates.
>
> Did I miss anything?
>
> Oh and Entropica can play pong with itself :-)
>
> John

> From: Tim Tyler [mailto:[email protected]]
> Sent: Sunday, February 9, 2014 3:51 PM
> To: AGI
> Subject: [agi] A new equation for intelligence?
>
> Here's Alex Wissner-Gross's TED presentation - on the
> "maximizing options" theory of intelligence and "Entropica".
>
> "Alex Wissner-Gross: A new equation for intelligence"
>
>  - https://www.youtube.com/watch?v=ue2ZEmTJ_Xo
>
> It seems as though this is a redefinition of intelligence.  Intelligence,
> conventionally, involves a broad-spectrum ability at achieving goals.
> Keeping your options open seems more like a common instrumental
> value.
>
> Go and chess playing are not, in fact, about "keeping your options
> open".  They are all about winning the game.  If that involves
> eliminating future options and bringing the game to an end, so be it.
>
> I'm actually a fan of the idea of a strong link between entropy
> generation and intelligence.  Rather ironically, I see the link as
> going the other way - at least most of the time.  As intelligent
> systems evolved, they have got better and better at seeking out
> energy gradients and dissipating them.  That's why we developed
> nuclear fission, for instance.
>
> Such behaviour doesn't "keep future options open" - rather it
> accelerates universal heat death.  Intelligent systems might
> *sometimes* conserve energy resources - in the way that Wissner-Gross
> suggests - but it is much more common for them to burn through
> them and convert them into offspring - and in those few cases
> where resources are conserved, it is *usually* to burn through
> them only *slightly* later on.
>
> The "keeping your options open" theory of intelligence seems
> silly to me.  I'm concerned that the real intelligence-entropy
> link will be polluted by association with this daft idea.
> --
> __________
>  |im |yler  http://timtyler.org/  [email protected]  Remove lock to reply.


> -------------------------------------------
> AGI
> Archives: https://www.listbox.com/member/archive/303/=now
> RSS Feed: https://www.listbox.com/member/archive/rss/303/11943661-d9279dae
> Modify Your Subscription: https://www.listbox.com/member/?&;
> Powered by Listbox: http://www.listbox.com






