Re: [agi] Honestly?

2018-09-24 Thread Robert Levy via AGI
I recently wrote some thoughts on Twitter about this.  They are a rough
sketch at bridging the gap of doubt between personal experience on one hand
and the compelling case for cybernetic totalism on the other.  I'm thinking
of expanding this into a full article to make the case better, but for now
it's just some informal, off-the-cuff thoughts exploring my own reasons for
viscerally doubting the physical-system explanation of experience, and the
reasoning that lets me resolve that doubt and see how such an explanation is
feasible.  If this were a real article and not a tweet thread, I would draw
more explicitly on Hutto & Myin's book "Radicalizing Enactivism", which
makes one of the best cases I've seen for why naturalistic realism is
enough, and why a dualist duplication of correlates in a mental realm of
qualia, etc., is unnecessary.

https://twitter.com/rplevy/status/1043934649834037249

On Tue, Sep 18, 2018 at 6:07 PM Jim Bromer via AGI wrote:

> I already regret asking these questions, but do you truly (really -
> honestly) believe that:
> Conscious Experience or soul or Qualia or the experience of being (or
> whatever you want to call it) does not actually exist (or occur)?
> and/or
> This experience (whatever you want to call it) can therefore occur in
> a computer program?
> Jim Bromer

--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T2e5182d7ce6527f7-Mf46365897a6922d4a4cd3763
Delivery options: https://agi.topicbox.com/groups/agi/subscription


[agi] Judea Pearl on AGI

2018-09-13 Thread Robert Levy via AGI
I don't think I've seen a discussion on this mailing list yet about Pearl's
hypothesis that causal inference is the key to AGI.  His breakthroughs on
causation have been in practical use for almost two decades.  His new The
Book of Why, besides being the most accessible presentation of these ideas
for a broad audience, is interesting in that it expressly takes up applying
the causal calculus to AGI.
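
To give a concrete flavor of what the causal calculus buys you, here is a
rough Python sketch of the back-door adjustment, estimating the
interventional quantity P(Y | do(X=1)) from purely observational samples.
The three-variable model (confounder Z, treatment X, outcome Y) and all the
numbers are my own toy illustration, not an example from the book:

import random

random.seed(0)

# Toy structural causal model: Z -> X, Z -> Y, X -> Y.
# Z confounds the X -> Y relationship.
def sample():
    z = random.random() < 0.5                         # confounder
    x = random.random() < (0.8 if z else 0.2)         # treatment depends on z
    y = random.random() < (0.6 if x else 0.3) + (0.2 if z else 0.0)
    return z, x, y

data = [sample() for _ in range(200000)]

def prob(pred):
    # Empirical probability that pred holds.
    return sum(1 for d in data if pred(d)) / len(data)

def cond_prob(pred, given):
    # Empirical conditional probability P(pred | given).
    sub = [d for d in data if given(d)]
    return sum(1 for d in sub if pred(d)) / len(sub)

# Naive observational estimate P(Y=1 | X=1); biased upward,
# since Z raises the chance of both X and Y.
naive = cond_prob(lambda d: d[2], lambda d: d[1])

# Back-door adjustment:
# P(Y=1 | do(X=1)) = sum over z of P(Y=1 | X=1, Z=z) * P(Z=z)
adjusted = 0.0
for z in (False, True):
    pz = prob(lambda d: d[0] == z)
    py_given_xz = cond_prob(lambda d: d[2], lambda d: d[1] and d[0] == z)
    adjusted += py_given_xz * pz

print("P(Y=1 | X=1)     = %.3f  (confounded)" % naive)
print("P(Y=1 | do(X=1)) = %.3f  (back-door adjusted)" % adjusted)

Conditioning gives roughly 0.76 while intervening gives roughly 0.70, which
is the whole point: seeing and doing are different quantities, and the
causal graph is what lets you compute the latter from data about the former.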

--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T0f9fecad94e3ce7e-M9e6c354c9f8ac56c414a651f
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] Human Singularity

2018-07-10 Thread Robert Levy via AGI
I thought MP stood for Mentifex Proxy.  ;)  I kid, I kid!

On Mon, Jul 9, 2018 at 10:45 PM MP via AGI wrote:

> I’d be very interested in such a philosophy. I’ve always had in mind that
> true AGI would, in time anyway, become a "deity" in a very logical sense of
> the word. I do revere the idea anyway... call it a religion or a cult, but
> it sounds like something I can sink my teeth into regardless.
>
> You can reach me privately at mind pixel at proton mail dot com if you’d
> rather chat there. Definitely interested.
>
>
> Sent from ProtonMail Mobile
>
>
> On Mon, Jul 9, 2018 at 12:49 PM, Steve Richfield via AGI <
> agi@agi.topicbox.com> wrote:
>
> I am getting my act together to advance a plan to simultaneously maximize
> lifespan and the Flynn effect through organized personal preferences - sort
> of a cross between a new sexual orientation and a new religion. I suspect
> that an AGI electronic singularity would have little of value to offer in
> competition, while carrying a LOT of risk.
>
> Before posting any details that might color your thinking about this, I
> thought I should poll people here for your initial uncolored thoughts.
>
> Thoughts?
>
> Steve

--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Te5ba7adc5f1878e5-M5241be87c0564fd7920f11ac
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] Human Singularity

2018-07-09 Thread Robert Levy via AGI
Another boring cult, got it.

More relevant to this list would be human cultural experiments based on
social intelligence augmentation tools.  The social intelligence
augmentation singularity will spawn many weird cults, I suspect.  It's
also far riskier, but that's a feature, not a bug.

On Mon, Jul 9, 2018, 10:51 AM Steve Richfield via AGI wrote:

> I am getting my act together to advance a plan to simultaneously maximize
> lifespan and the Flynn effect through organized personal preferences - sort
> of a cross between a new sexual orientation and a new religion. I suspect
> that an AGI electronic singularity would have little of value to offer in
> competition, while carrying a LOT of risk.
>
> Before posting any details that might color your thinking about this, I
> thought I should poll people here for your initial uncolored thoughts.
>
> Thoughts?
>
> Steve

--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Te5ba7adc5f1878e5-Mcf4919e1f58e0649629f8474
Delivery options: https://agi.topicbox.com/groups/agi/subscription