Stathis writes

> > Do you imagine that it's possible that we could go to
> > another star, and encounter beings who discoursed with
> > us about every single other thing, yet denied that they
> > had consciousness, and professed that they had no idea
> > what we were talking about?
>
> The above question is a version of the zombie problem, and there are two
> slightly different answers depending on whether you are talking about human
> zombies or zombies from another planet. Human zombies are easy: they're not
> really zombies.
I agree, and add the reasons you state to the ones I already have.

> I would still say that even if it could somehow
> be shown that appropriate brain states necessarily lead to conscious states,
> which I suspect is the case, it would still not be clear how this comes
> about, and it would still not be clear what this is like unless you
> experience the brain/conscious state yourself, or something like it.

I anticipate that in the future it will, as you say so well, be shown that
"appropriate brain states necessarily lead to conscious states", except I
also expect that by then the meaning of "conscious states" will be vastly
better informed and filled out than it is today. In particular, the concept
will have migrated from a mix of 1st and 3rd person notions to entirely 3rd
person notions. I speculate that after this occurs, people won't consider the
old 1st person notion to be of much value (after all, you can't really use it
to communicate with anyone about anything). But of course, all that is just
speculation.

> You could dismiss this as unimportant, but I think it makes 1st person
> experience fundamentally different from everything else in the universe.

Yes, but I don't think that there is any answer to the "hard problem".
Concretely, I conjecture that of the 100000^5000 or so possible strings of
5000 words in the English language, not a single one of them solves this
problem. But if so, then even after the scientific problem of determining
which brain circuits are necessary and sufficient to produce all
conscious-appearing behavior is solved, there will still be people who feel
mystified.

> As for aliens, I don't see how we could possibly assume that organisms who
> did not even evolve on our planet have anything in common with us mentally.
> They may be more fundamentally alien and different to us than bats or
> lobsters are, and it may be completely impossible to empathise with them,
> even if we could somehow tap into their minds.
I agree that nothing is for sure; yet I'd expect that they'd have incentives
to survive metabolically, and a will to exert control over their immediate
environments just as we do. So there would eventually be negotiations, and
I'm confident that even empathy for them would develop.

In your other email you write

> ...I would have to repeat my reply to Jonathan Colvin, which
> is that we basically agree on the facts of the matter but
> choose to appraise/ interpret/ describe them in a different way.

It's possible that we have reached that point so far as I'm concerned;
thanks for continuing.

Lee