A deep learning system set up for next sentence prediction, one that consumed
gigabytes of literature, would learn to mimic emotions as expressed in writing.
It would likely have mappings of context and events to plausible emotional
descriptions. It would have latent encodings of the
*list of things Cybermen do to make even the Doctor yell
RN! and book it to the TARDIS as well here*
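A minimal sketch of that idea, using a toy bigram next-word model and a tiny invented corpus (both purely illustrative, not any particular system): the model learns which emotional descriptions tend to follow which contexts, without feeling anything itself.

```python
# Toy illustration: a bigram next-word model trained on a tiny,
# hypothetical corpus picks up associations between events and the
# emotional language that follows them.
from collections import Counter, defaultdict

corpus = [
    "losing the match left her feeling devastated",
    "losing the keys left him feeling frustrated",
    "winning the prize left her feeling overjoyed",
]

# Count word -> next-word transitions across the corpus.
transitions = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        transitions[a][b] += 1

def most_likely_next(word):
    """Return the most frequent continuation seen after `word`."""
    return transitions[word].most_common(1)[0][0]

# The model has no feelings, but it has learned which emotional
# descriptions plausibly follow which contexts.
print(most_likely_next("feeling"))
```

A large language model does the same thing at vastly greater scale, which is why it can produce plausible emotional language on cue.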
On Tue, Oct 18, 2022 at 6:35 PM Gillian Densmore wrote:
*terminator soundtrack here*
On Tue, Oct 18, 2022 at 5:55 PM Prof David West wrote:
Maybe lack of emotion, but the ability to 'fake it' by repeating what it has
read a being with that emotion would say, only proves the AI is a sociopath
or psychopath.
davew
On Tue, Oct 18, 2022, at 4:44 PM, Russ Abbott wrote:
When Blake Lemoine claimed that LaMDA was conscious, it struck me that one
way to test that would be to determine whether one could evoke an emotional
response from it. You can't cause it physical pain since it doesn't have
sense organs. But, one could ask it if it cares about anything. If so,
I am concurrently reading *Nineteen Ways of Looking at Consciousness*, by
Patrick House, and *The Mountain in the Sea*, by Ray Nayler. The latter is fiction.
(The former, because it deals with consciousness, may also be fiction, but it
purports to be neuroscientific / philosophical.)
The novel is
There are many different measures of *types* of consciousness. But without
specifying the type, such questions are not even philosophical. They're
nonsense.
For example, the test of whether one can recognize one's image in a mirror couldn't be
performed by a chatbot. But it is one of the
Paul Buchheit asked on Twitter
(https://twitter.com/paultoo/status/1582455708041113600): "Is consciousness
measurable, or is it just a philosophical concept? If an AI claims to be
conscious, how do we know that it's not simply faking/imitating consciousness?
Is there something that I could
Colloquially, "in your head" signifies that construct (the self, the
consciousness) that you do not accept as a real thing.
If one were to be charitable, one might interpret the sentence as "it is not
psychosomatic." The reason being that pain results in signals in both the body
and the brain, so it is
Hmmm, so what does that say about us when something is a royal PITA? Is that
then where your brain is located? If so, it might explain a few
things. :P Thank you, thank you. I'll see myself out.
On Tue, Oct 18, 2022 at 1:44 PM wrote:
A highly regarded pain expert, Dr. Carmen Green, talking about chronic pain
on the podcast of an equally highly regarded neurosurgeon, Dr. Sanjay
Gupta:
"... pain is also perceived in the brain, so it's not only in your head."
Nick Thompson
thompnicks...@gmail.com