On 2/4/2015 11:40 AM, Bruno Marchal wrote:

On 03 Feb 2015, at 20:40, meekerdb wrote:

On 2/3/2015 11:13 AM, Jason Resch wrote:
I agree with John. If consciousness had no third-person observable effects, it would be an epiphenomenon. And then there is no way to explain why we're even having this discussion about consciousness.

I'm not arguing that it has no observable effects. JKC says it's necessary for intelligence. I'm arguing that it might have been necessary for the evolution of intelligence starting from, say, fish. But that doesn't entail that it is necessary for any intelligent system.

It is not necessary for any competent system, but intelligence is not competence; it is more like an understanding of our own incompetence, an ability to learn, notably through errors and "dreams".

Why isn't learning just a matter of increasing competence based on experience? I don't see that learning is any different from other competences.
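As a toy illustration of that point (my own sketch, assuming nothing more than a stock epsilon-greedy bandit; none of these names come from the thread), a learner can "increase competence based on experience" with a bare update rule and no reflection at all:

# Toy illustration: learning as nothing more than competence increasing with experience.
# An epsilon-greedy bandit improves its average reward purely by updating value
# estimates from feedback; no self-model or reflection is involved.
import random

ARM_PROBS = [0.2, 0.5, 0.8]   # hidden payoff probabilities (assumed for the example)
EPSILON = 0.1                 # exploration rate

def run(trials=10_000):
    values = [0.0] * len(ARM_PROBS)   # estimated payoff of each arm
    counts = [0] * len(ARM_PROBS)
    total_reward = 0.0
    for _ in range(trials):
        # explore occasionally, otherwise pick the arm currently believed best
        if random.random() < EPSILON:
            arm = random.randrange(len(ARM_PROBS))
        else:
            arm = max(range(len(ARM_PROBS)), key=lambda a: values[a])
        reward = 1.0 if random.random() < ARM_PROBS[arm] else 0.0
        counts[arm] += 1
        # incremental average: the estimate of "which arm pays" sharpens with each trial
        values[arm] += (reward - values[arm]) / counts[arm]
        total_reward += reward
    return total_reward / trials

if __name__ == "__main__":
    print(f"average reward after learning: {run():.3f}")  # climbs toward the best arm's ~0.8

Run it and the average reward climbs toward the best arm's payoff: competence increases with experience, yet nothing in the system looks like consciousness.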






If we build computers that discuss and question their own consciousness and qualia, I'd consider that proof enough that they are conscious.

But is that the standard of intelligence? JKC argues intelligence=>consciousness. What if they discuss and question their own consciousness, but say stupid things about it?

That's what intelligent systems do: they say stupid things.
Intelligence just adds the question mark '?' after them. It is harm reduction for everybody. It helps for the next change of mind.



The bigger question is: which machines might be conscious yet unable to talk about, reflect upon, or signal to us that they are in fact conscious? This requires a theory of consciousness.

Exactly. That is my concern. Suppose we build an autonomous Mars Rover to do research. We give it learning ability, so it must reflect on its experience and act intelligently. Have we made a conscious being? Contrary to Bruno, I think there are kinds and degrees of consciousness - just as there are kinds and degrees of intelligence.

It will be conscious at the place where it confuses itself with the (relatively real) environment. OK. It depends also on its abilities, and you can make it self-conscious by adding enough induction axioms. Don't put in too many induction axioms, though, or the Mars Rover will get stuck in a self-dialogue about its consciousness and how to convince those self-styled [censored] humans!

So without the too-many induction axioms it will be conscious, but not self-conscious. Thus you agree that consciousness is not all-or-nothing.

Brent





