On Thu, Apr 29, 2021 at 1:57 AM 'Brent Meeker' via Everything List <[email protected]> wrote:
> On 4/28/2021 9:42 PM, Terren Suydam wrote:
>
>> On Wed, Apr 28, 2021 at 8:15 PM 'Brent Meeker' via Everything List <[email protected]> wrote:
>>
>>> On 4/28/2021 4:40 PM, Terren Suydam wrote:
>>>
>>>> I agree with everything you said there, but all you're saying is that intersubjective reality must be consistent to make sense of other people's utterances. OK, but if it weren't, we wouldn't be here talking about anything. None of this would be possible.
>>>
>>> Which is why it's a fool's errand to say we need to explain qualia. If we can make an AI that responds to the world the way we do, that's all there is to saying it has the same qualia.
>>
>> I don't think either of those claims follows. We need to explain suffering if we hope to make sense of how to treat AIs. If it were only about redness I'd agree. But creating entities whose existence is akin to being in hell is immoral. And we should know if we're doing that.
>
> John McCarthy wrote a paper in the '50s warning about the possibility of accidentally making a conscious AI and unknowingly treating it unethically. But I don't see the difference from any other qualia; we can only judge by behavior. In fact this whole thread started by JKC considering AI pain, which he defined in terms of behavior.

A theory would give you a way to predict what kinds of beings are capable of feeling pain. We wouldn't have to wait to observe their behavior; we'd say "given theory X, we know that if we create an AI with these characteristics, it will be the kind of entity that is capable of suffering."

>> To your second point, I think you're too quick to make an equivalence between an AI's responses and their subjective experience. You sound like John Clark - the only thing that matters is behavior.
>
> Behavior includes reports. What else would you suggest we go on?
Again, with a theory of consciousness that explains how qualia come to be within a system, you could make claims about their experience that go beyond observing behavior. I know John Clark's head just exploded, but that's the point of having a theory of consciousness.

> Brent
>
> --
> You received this message because you are subscribed to the Google Groups "Everything List" group.
> To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
> To view this discussion on the web visit https://groups.google.com/d/msgid/everything-list/577ce844-a528-4dcd-deab-3cf1e5e833e8%40verizon.net.

--
You received this message because you are subscribed to the Google Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
To view this discussion on the web visit https://groups.google.com/d/msgid/everything-list/CAMy3ZA-joR0sTiicxUM7vpjcgw-wrGHv3Oa24AJigv7%2B5RHefA%40mail.gmail.com.

