Based on in-between-life (soul world) hypnotic regression studies, the point of 
life is learning, skills, abilities, and generally improving. Later on, after 
some development, a soul may choose a focus or mission that either spans 
multiple incarnations, like a bodhisattva's, or is soul-world-bound, like that 
of a soul adviser. 

Anyway, suffice it to say that with high-level robot bodies the learning 
opportunities and possible missions only increase, so fewer high-level souls 
will be soul-world-bound. Or at least more could find new learning experiences.



On March 26, 2017 2:07:55 PM EDT, TimTyler <[email protected]> wrote:
>On 2017-03-22 11:32, Telmo Menezes wrote:
>> Hi Tim,
>>
>> Ok, I don't disagree. I would just argue that being condemned to live
>> in a zoo could be considered an even worse outcome than extinction.
>> No?
>
>There's certainly a school of self-proclaimed hedonists
>that identifies the normal resting state as hedonic zero,
>and categorizes anything below that as "suffering". Vegans
>use this type of reasoning to justify the notion that it
>would be better for factory-farmed animals if they had never
>been born. There's even a book titled "Better Never to Have
>Been: The Harm of Coming into Existence" that makes a
>similar argument for humans.
>
>I generally favor an alternative position which I
>refer to as "the ecstasy of existence". According
>to this idea, not existing is very bad, and you
>have to go a long way into the realms of pain and
>fear before you reach states that are comparably bad.
>I claim that this position is better supported by
>evolutionary theory, and by the low incidence of
>suicide. In practice, most creatures prefer suffering
>to death - at least up to a point.
>
>My position identifies far future humans in historical
>simulations as being likely to lead reasonable lives -
>and that's better than not existing at all.
>
>Of course, there's also the much-discussed hypothesis
>that machines will subject those who oppose their
>construction to eternal torture under simulation -
>in order to better motivate their originals. Under
>such a "hellish" scenario, many far future humans would
>lead pretty miserable lives. However, it doesn't seem
>very plausible that machines will bother making
>credible commitments to subject humans to eternal
>torture. Such machines would not be very popular
>and we probably won't build them.
>
>[snip history and link to http://matchingpennies.com/far_future_humans/
>]
>
>-- 
>__________
>  |im |yler http://timtyler.org/
>
>
>
>-------------------------------------------
>AGI
>Archives: https://www.listbox.com/member/archive/303/=now
>RSS Feed:
>https://www.listbox.com/member/archive/rss/303/5037279-a88c7a6d
>Modify Your Subscription:
>https://www.listbox.com/member/?&;
>Powered by Listbox: http://www.listbox.com

-- 
Sent from my Android device with K-9 Mail. Please excuse my brevity.

