On Thu, Jul 11, 2019 at 11:09 PM Bruno Marchal <marc...@ulb.ac.be> wrote:

> On 11 Jul 2019, at 14:23, Bruce Kellett <bhkellet...@gmail.com> wrote:
>
> On Thu, Jul 11, 2019 at 8:40 PM Bruno Marchal <marc...@ulb.ac.be> wrote:
>
>>
>> > On 10 Jul 2019, at 23:04, 'Brent Meeker' via Everything List <
>> everything-list@googlegroups.com> wrote:
>> >
>> >
>> >
>> > On 7/10/2019 7:59 AM, Bruno Marchal wrote:
>> >> The machine defined by the two following equations, Kxy = x and Sxyz =
>> xz(yz), plus S ≠ K, and with the combinator induction axiom (that I gave
>> some posts ago), is already as conscious as you and me.
>> >
>> > Which in itself is a reductio of your theory.
>>
>> Why? If you agree with the definition of consciousness that I have given
>> (true, knowable, non-provable, non-definable without invoking truth), then
>> SK+induction *is* provably conscious, and indeed has the G/G* theology
>> applicable to it.
>>
>
> Come now. That is just the cat=dog argument for which I have often
> criticised you. You take a superficial resemblance between two things and
> claim identity.
>
>
>
> Not identity, but equivalence.
>

Is not identity an equivalence relation? You are chopping logic.


> Consciousness is a general term, like cat and dog are both quadruped
> mammals.
> You are criticising the axiomatic method.
>

Science is not axiomatic.


> I certainly do not identify the many possible consciousnesses, as numerous
> as the possible persons, human or not, and in fact the work shows the
> existence of very varied forms of consciousness.
>

> It is not because both dog and cat are quadruped mammals that Dog = Cat.
>

No, one can point to many more dissimilarities than there are similarities.
So your attempt at equivalence or identity between your pathetically
inadequate definition of consciousness and your combinator logic fails at
every level. In other words, you have not 'explained' consciousness -- you
are not even talking about consciousness as usually understood.
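
For reference, the two equations quoted at the top, Kxy = x and Sxyz =
xz(yz), are nothing more than rewrite rules on strings of symbols. A minimal
sketch in Python, with every name my own and chosen purely for illustration,
shows the whole of what such a "machine" does:

# Terms are the strings "S"/"K"/variables, or ("app", f, a) pairs for application.
def app(f, a):
    return ("app", f, a)

def spine(term):
    """Unwind left-nested applications: return (head, [arg1, arg2, ...])."""
    args = []
    while isinstance(term, tuple):
        _, term, a = term
        args.append(a)
    return term, list(reversed(args))

def step(term):
    """Apply one K- or S-rule at the head of the term, or return None."""
    head, args = spine(term)
    if head == "K" and len(args) >= 2:        # Kxy -> x
        new, rest = args[0], args[2:]
    elif head == "S" and len(args) >= 3:      # Sxyz -> xz(yz)
        x, y, z, rest = args[0], args[1], args[2], args[3:]
        new = app(app(x, z), app(y, z))
    else:
        return None
    for a in rest:                            # re-attach any remaining arguments
        new = app(new, a)
    return new

def normalise(term, limit=1000):
    """Rewrite until no rule applies (or the step budget runs out)."""
    for _ in range(limit):
        nxt = step(term)
        if nxt is None:
            break
        term = nxt
    return term

# SKKx -> Kx(Kx) -> x : the SKK combination behaves as the identity.
assert normalise(app(app(app("S", "K"), "K"), "x")) == "x"

Symbols in, symbols out; anything further one wants to claim about such a
system has to be argued for, not read off the two equations.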

> But if you are OK that consciousness is characterised by the
> quasi-axiomatic definition I give, then all universal machines can be said
> to be conscious, and the Löbian machine can be said to be self-conscious.
>

I reject your definition of consciousness as totally inadequate. As Brent
points out, it does not even begin to cover important aspects of
consciousness, such as awareness of an environment.


> Very poor logic, I must say.
>
>
> It is called the axiomatic method, and it is the jewel brought by modern
> logic. The idea is to characterise things by searching for principles on
> which we agree about those things, leaving open that we might later add
> incompatible propositions to give different examples of the thing, like we
> could add "barking" to "quadruped mammals" if we want to distinguish dog
> from cat.
>

And we could add "interacting with an environment", or "capable of
autonomous action", or "can pass a Turing Test" to the list of
characteristics of consciousness. None of these additional features is
satisfied by your combinators, so your claimed equivalence is far from
being established.


> Since Gödel we have good reason for doing that, because we know that all
> concepts “rich enough” cannot be defined at all, but can be characterised
> by a first-order axiomatic system, or by a definition *in* such a system.
> We can't do better a priori, unless we are gods or something that is not
> Turing emulable. All theories about computer programs are essentially
> undecidable, and most concepts there are not univocally definable: you
> need to add a non-computable set of postulates to characterise them
> univocally, which of course cannot be done.
>
> "True, knowable, non-provable, definable without invoking truth" is but a
> poor definition of consciousness,
>
> That is an opinion, and I have no clue why you say this.
>

That would suggest that you don't know what consciousness is.


> Keep in mind that we already have precise mathematical definitions of truth
> (for the simple Löbian machine), provable, knowable, and non-definable, and
> that all I say makes sense thanks to the theorems of Gödel 1931, Löb 1955,
> and Solovay 1976 (on G and G*).
>

What has 'truth' got to do with it? Is an axiom conscious?
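
A note for anyone following the references, assuming the standard reading of
the works Bruno cites (Boolos; Solovay 1976): G is the modal logic of what a
sound machine can prove about its own provability, and G* is the logic of
what is in fact true about that provability. In the usual notation:

  G :   normal modal logic K plus Löb's axiom  \Box(\Box p \to p) \to \Box p
  G* :  every theorem of G, plus all instances of the reflection schema
        \Box p \to p  (closed under modus ponens only)
  Gap:  G* proves \neg\Box\bot (the machine's consistency), G does not;
        consistency is true of the machine but not provable by it
        (Gödel's second theorem).

That gap between provability and truth is where Bruno's appeal to 'truth'
does its work; whether it has anything to do with consciousness is exactly
what is in dispute.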


> even if it may be a property of your feeble combinators
>
> Feeble?
>

Yes, feeble. You put your combinators and a logic text into a room and
shut the door. They couldn't even report back the colour of the wallpaper,
much less initiate any autonomous action, or pass a Turing test.

Bruce
