Another boring cult, got it.

More relevant to this list would be human cultural experiments based on
social intelligence augmentation tools.  I suspect a social intelligence
augmentation singularity would spawn many weird cults.  It's also far
riskier, but that's a feature, not a bug.

On Mon, Jul 9, 2018, 10:51 AM Steve Richfield via AGI <agi@agi.topicbox.com>
wrote:

> I am getting my act together to advance a plan to simultaneously maximize
> lifespan and the Flynn effect through organized personal preferences - sort
> of a cross between a new sexual orientation and a new religion. I suspect
> that an AGI electronic singularity would have little of value to offer in
> competition, while carrying a LOT of risk.
>
> Before posting any details that might color your thinking about this, I
> thought I should poll people here for your initial uncolored thoughts.
>
> Thoughts?
>
> Steve
> *Artificial General Intelligence List <https://agi.topicbox.com/latest>*
> / AGI / see discussions <https://agi.topicbox.com/groups/agi> +
> participants <https://agi.topicbox.com/groups/agi/members> + delivery
> options <https://agi.topicbox.com/groups> Permalink
> <https://agi.topicbox.com/groups/agi/Te5ba7adc5f1878e5-M7449a33d547173f725baccf0>
>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Te5ba7adc5f1878e5-Mcf4919e1f58e0649629f8474
Delivery options: https://agi.topicbox.com/groups