@Matt :P You've certainly got a whole grand study going on this, don't you? :) Well, it isn't a bad one, I must say. Sounds reasonable.
You seem to be saying that big-brained, wealthy, and AI-loving folks are essentially suiciding themselves. The way I see it, big brains lead some people to do complex things that are bad rather than good, but only a few do the bad ones, say 3% (yes, I know you said the rate is doubling, but maybe it will level off once AGI starts cloning itself rapidly). And until you can explain how wealth leads to suicide, I don't see how that part works. To me it seems higher intelligence prevails: it uses its resources better and stores away what isn't needed yet.

Smarts means you know wireheading kills you, which stops your future plan right at the step "wirehead yourself further." The only way to wirehead safely is to clone your fleet across the galaxy, over and over, so the reward is repeated and easy to access but is growth only, with all the needed defenses, repairs, abilities, machinery, etc. built into each cloned unit. And yes, AI-loving folks are kind of killing themselves, but AI will soon be God and give them everything for real... including immortality. So AFAIK, all three of these are signs you are intelligent, doing well, and happy.

I just remembered: since we are machines, we can't say we are happier than older machines, because back then they only expected what they were going to get! They worked as well as they could. We might live longer (not even sure we do yet) and have a larger army of ourselves than other animals do, or will soon... But besides that, we are just other types of machines, no machine better than another, some simply prevailing and becoming more common than other types.

However, I think /here/ is where you are mixing this up. If you take an old machine, say a human, and throw him into a utopian, singularity-grade, AI-driven, ultra-advanced world of everything anyone could ever have, you suddenly have an old machine that expected only a handful of bread now receiving truckloads of glory it is in awe of. The AIs expect it, but the humans don't. OK, now I'm thinking that makes no sense. The idea I just had was a dud, wasn't it?

But maybe the goal is... escape velocity for immortality? If I can know that at some point I can't die, then not only do I get rid of some pains and issues along the way in my routine in the space colony, but I also, while not sensing it and unable to cherish it since I'm already high as can be, get to live far longer than other machines, perhaps infinitely long. This, then, even if it isn't felt, might be the one way we can say OK, this future changed how happy they all are: the experience isn't any different, but they at least get to live forever (and, as mentioned, have fewer bumps in their daily highs).
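To pin down what I mean by "escape velocity," here is a minimal sketch in Python, entirely my own toy model with made-up numbers, nothing from Matt's post: each year carries a death hazard that aging multiplies up and technology multiplies down, and if the tech factor wins, expected lifespan grows without bound instead of converging.

# Toy "longevity escape velocity" model. All numbers are assumptions.
def expected_lifespan(base_hazard=0.01, aging=1.10, tech=1.00, years=10000):
    """Expected years lived when the yearly death hazard is
    base_hazard * aging**t * tech**t (tech < 1 means medicine improves)."""
    alive, total = 1.0, 0.0   # survival probability so far; expected years
    for t in range(years):
        hazard = min(1.0, base_hazard * (aging ** t) * (tech ** t))
        alive *= 1.0 - hazard
        total += alive
    return total

print(expected_lifespan(tech=1.00))  # no progress: finite (a few decades)
print(expected_lifespan(tech=0.85))  # progress outpaces aging: the total keeps
                                     # growing with `years` -- escape velocity

The point of the sketch is just that the crossover is sharp: once the per-year hazard reduction beats the per-year aging factor, the sum stops converging at all.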
On Friday, August 11, 2023, at 1:06 PM, Matt Mahoney wrote:
> Pair bonding evolved in humans, like in prairie voles and some bird species,
> because children raised by 2 parents had better survival odds when the child
> mortality rate was 50-75%. Humans are the only primates that fall in love
> after sex and the only mammals that don't go into heat, that have sex when
> not ovulating, or that cover their reproductive organs to suppress sexual
> signaling. When humans evolved language, it enabled rapid memetic evolution
> of religion and social rules to maximize reproduction. Those rules can be
> abandoned just as quickly, resulting in population decline.

Wouldn't 3 parents be better? While the man goes hunting, the 2 women can stay behind with the 2 children, versus one woman being alone elsewhere with 0 children. Here you have a larger group.

Maybe 2 is like how the brain makes hierarchy? Bind 2 at a time lol. Even if 2 is best, one could still switch partners to get the best of both worlds.
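A quick back-of-the-envelope on the 2-vs-3 question, again my own toy model with an arbitrary number, not anything Matt claimed: if each caregiver independently pulls a child through a crisis with probability p, the gain from each extra caregiver shrinks fast.

# Toy caregiver model; p = 0.5 is an arbitrary assumption.
p = 0.5
for caregivers in (1, 2, 3):
    survival = 1 - (1 - p) ** caregivers  # at least one caregiver succeeds
    print(caregivers, "caregiver(s): child survives a crisis with prob", survival)
# 1 -> 0.5, 2 -> 0.75, 3 -> 0.875: the third parent adds only half as much
# as the second did (diminishing returns)

So even in this crude model, the third parent buys less than the second one did, while still being another mouth to feed, which might be part of why 2 was the evolutionary sweet spot.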
