On 4/28/2021 9:42 PM, Terren Suydam wrote:
> On Wed, Apr 28, 2021 at 8:15 PM 'Brent Meeker' via Everything List <[email protected] <mailto:[email protected]>> wrote:
>> On 4/28/2021 4:40 PM, Terren Suydam wrote:
>>> On Wed, Apr 28, 2021 at 7:25 PM 'Brent Meeker' via Everything List <[email protected] <mailto:[email protected]>> wrote:
>>>> On 4/28/2021 3:17 PM, Terren Suydam wrote:
>>>>> On Wed, Apr 28, 2021 at 5:51 PM John Clark <[email protected] <mailto:[email protected]>> wrote:
>>>>>> On Wed, Apr 28, 2021 at 4:48 PM Terren Suydam <[email protected] <mailto:[email protected]>> wrote:
>>>>>>>>> testimony of experience constitutes facts about consciousness.
>>>>>>>>
>>>>>>>> Sure I agree, provided you first accept that consciousness is the inevitable byproduct of intelligence.
>>>>>>>
>>>>>>> I hope the irony is not lost on anyone that you're insisting on your theory of consciousness to make your case that theories of consciousness are a waste of time.
>>>>>>
>>>>>> If you believe in Darwinian evolution, and if you believe you are conscious, then given that evolution can't select for what it can't see, and natural selection can see intelligent behavior but can't see consciousness, can you give me an explanation of how evolution managed to produce a conscious being such as yourself if consciousness is not the inevitable byproduct of intelligence?
>>>>>
>>>>> It's not an inevitable byproduct of intelligence if consciousness is an epiphenomenon. As you like to say, consciousness may just be how data feels as it's being processed. If so, that doesn't imply anything about intelligence per se, beyond the minimum intelligence required to process data at all... the simplest example being a thermostat. That said, do you agree that testimony of experience constitutes facts about consciousness?
>>>>
>>>> It wouldn't if it were just random, like plucking passages out of novels. We only take it as evidence of consciousness because there are consistent patterns of correlation with what each of us experiences.
>>>> If every time you pointed to a flower you said "red", regardless of the flower's color, a child would learn that "red" meant a flower, and his reporting when he saw red wouldn't be testimony to the experience of red. So the usefulness of reports already depends on physical patterns in the world. Something I've been telling Bruno... physics is necessary to consciousness.
>>>>
>>>> Brent
>>>
>>> I agree with everything you said there, but all you're saying is that intersubjective reality must be consistent for us to make sense of other people's utterances. OK, but if it weren't, we wouldn't be here talking about anything. None of this would be possible.
>>
>> Which is why it's a fool's errand to say we need to explain qualia. If we can make an AI that responds to the world the way we do, that's all there is to saying it has the same qualia.
>
> I don't think either of those claims follows. We need to explain suffering if we hope to make sense of how to treat AIs. If it were only about redness I'd agree. But creating entities whose existence is akin to being in hell is immoral. And we should know if we're doing that.
John McCarthy wrote a paper in the '50s warning about the possibility of accidentally making a conscious AI and unknowingly treating it unethically. But I don't see the difference from any other qualia: we can only judge by behavior. In fact, this whole thread started with JKC considering AI pain, which he defined in terms of behavior.
> To your second point, I think you're too quick to make an equivalence between an AI's responses and its subjective experience. You sound like John Clark - the only thing that matters is behavior.
Behavior includes reports. What else would you suggest we go on?

Brent

--
You received this message because you are subscribed to the Google Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
To view this discussion on the web visit https://groups.google.com/d/msgid/everything-list/577ce844-a528-4dcd-deab-3cf1e5e833e8%40verizon.net.

