I'll need to digest your more-abstract comments.
"For safety it might help for AGI and perhaps it's unavoidable to partially exist within the human/biological rendered simulation."

I guess the above is similar to what I'm saying. It's against my better judgement to hold this view, but to advance, we might have to seriously consider your suggestion. Once we achieved a state of in-situ AGI, it could be that our options for translating it into a machine-enabled form would become clear.

Let's go science fiction on this. Imagine standing in front of an AGI vending machine at an international airport. You are presented with three versions of AGI into which you could transform for your trip. You step inside the booth, select your option and gender, pay, and undergo the central-nervous-system memory, brain, and consciousness porting transformation. It could be your physical body is left comatose and carted off on a conveyor belt for storage until your return. Alternatively, it could be you step out of the booth in a machine exo-form representing characteristics of your choice, skin of your skin, bio-plugged into the humanoid form via central-nervous-system integration. You, on the inside, tactilely attached to the exo-form, ready to live augmented reality to the full, seamlessly plugged into everything and everyone around you.

If such an objective could be achieved without diminishing the authenticity of the image of the original AGI, and while satisfying ethical science - whose imperative is voluntary - that may be the speediest way forward.

________________________________
From: John Rose <[email protected]>
Sent: Sunday, 07 March 2021 18:35
To: AGI <[email protected]>
Subject: Re: [agi] Patterns of Cognition

On Sunday, March 07, 2021, at 5:00 AM, Nanograte Knowledge Technologies wrote:
Having said that, I'm not against examining the possibility of a special kind of simulation, one we have not quite managed to find the correct words and description for.
Bearing in mind that all we'd be doing by becoming AGI is simulating our characteristic selves as a generalized species with intelligence. Perhaps there's a secret switch somewhere, a mode switch?

You cover a lot; I'll hit on a couple of items.

Duality needn't be crisp. In fact, I think nothing is purely crisp except models/virtualizations. Duality, with regard to "this", would essentially be a communication-protocol item at the middle to upper layer, alluding to something like the OSI network layers. Duality is a construct and can be modelled as a non-crisp binary logic emerged from a quantum layer, since intelligent agents are distributed and need to operate and survive, make choices, etc.

There are multiple simulations, but the one that is guaranteed, IMO, is the biological/human simulation we create/created and exist in. Other simulations are speculative AFAIK, though they may pertain; they may be utilizable as alternate computing methods... For safety it might help for AGI, and perhaps it's unavoidable, to partially exist within the human/biological rendered simulation.

"This" is still attainable from non-quantum computing methods, but it wouldn't equal a human-level "this". An artificial agent can still render its perceptive complexity of reality and model/compute a "this". That particular "this", though, would be lacking in certain features like non-locality, but non-locality is still very modellable. Quantum computing, I agree, is a game changer.

The recording you posted is interesting in that I think it displays a layer below duality, but it still has to transmit through duality for us to see.
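As a side note, one conventional way to picture a "non-crisp binary logic" of the kind mentioned above is fuzzy logic, where a proposition carries a degree of truth in [0, 1] rather than a hard 0/1. The sketch below is purely illustrative and is not from the thread; it assumes the standard Zadeh operators (AND = min, OR = max, NOT = 1 - x):

```python
# Toy fuzzy ("non-crisp") logic: truth values are degrees in [0, 1]
# rather than crisp True/False. Standard Zadeh operators.

def f_and(a: float, b: float) -> float:
    return min(a, b)

def f_or(a: float, b: float) -> float:
    return max(a, b)

def f_not(a: float) -> float:
    return 1.0 - a

# A "duality" whose poles are mostly-but-not-purely opposed:
this = 0.75
that = f_not(this)  # 0.25 -- the poles need not be exactly 1 and 0

print(f_and(this, that))  # 0.25: the poles overlap, unlike crisp logic where A AND NOT A = 0
print(f_or(this, that))   # 0.75: likewise, A OR NOT A need not equal 1
```

The point of the toy is only that the two poles of such a duality can coexist with nonzero degree, which crisp binary logic forbids.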
Artificial General Intelligence List<https://agi.topicbox.com/latest> / AGI / see discussions<https://agi.topicbox.com/groups/agi> + participants<https://agi.topicbox.com/groups/agi/members> + delivery options<https://agi.topicbox.com/groups/agi/subscription> Permalink<https://agi.topicbox.com/groups/agi/Ta5ed5d0d0e4de96d-M8202efbc6e45825771f8ebc8>
Permalink: https://agi.topicbox.com/groups/agi/Ta5ed5d0d0e4de96d-M16c9f2db77f648a456163caa
