2012/10/22 Stephen P. King <stephe...@charter.net>

> On 10/21/2012 7:14 PM, Stathis Papaioannou wrote:
>
>> On Mon, Oct 22, 2012 at 1:55 AM, Stephen P. King <stephe...@charter.net>
>> wrote:
>>
>>>> If there is a top-down effect of the mind on the atoms then we
>>>> would expect some scientific evidence of this. Evidence would
>>>> constitute, for example, neurons firing when measurements of
>>>> transmembrane potentials, ion concentrations etc. suggest that they
>>>> should not. You claim that such anomalous behaviour of neurons and
>>>> other cells due to consciousness is widespread, yet it has never been
>>>> experimentally observed. Why?
>>>>
>>>
>>> Hi Stathis,
>>>
>>>      How would you set up the experiment? How do you control for an
>>> effect that may well be ubiquitous? Did you somehow miss the point
>>> that consciousness can only be observed in 1p? Why are you so
>>> insistent on a 3p of it?
>>>
>> A top-down effect of consciousness on matter could be inferred if
>> miraculous events were observed in neurophysiology research.
>> Consciousness itself cannot be directly observed.
>>
>
> Hi Stathis,
>
>     This would be true only if consciousness were separate from matter,
> as in Descartes' failed theory of substance dualism. In the dual aspect
> theory that I am arguing for, there would never be any "miracles" that
> would contradict physical law. At most there would be statistical
> deviations from classical predictions. See
> http://boole.stanford.edu/pub/ratmech.pdf for details. My support for
> this theory rather than materialism follows from materialism's
> demonstrated inability to account for 1p. Dual aspect monism has 1p
> built in from first principles. BTW, I don't use the term "dualism" any
> more, as what I am advocating seems too easily confused with the failed
> version.
>
>
>>>> I don't mean putting an extra module into the brain, I mean putting
>>>> the brain directly into the same configuration it is put into by
>>>> learning the language in the normal way.
>>>>
>>>
>>>      How might we do that? Alter 1 neuron and you might not have the same
>>> mind.
>>>
>> When you learn something, your brain physically changes. After a year
>> studying Chinese it goes from configuration SPK-E to configuration
>> SPK-E+C. If your brain were put directly into configuration SPK-E+C
>> then you would know Chinese and have a false memory of the year of
>> learning it.
>>
>
>     Ah, but is that change, from SPK-E to SPK-E+C, one that is
> measurable strictly in terms of the number of neurons changed? No. I
> would conjecture that it is a computational problem that is at least
> NP-hard. My reasoning is that if the change were emulable by a
> computation X *and* that X could also be used to solve an NP-hard
> problem, then there would exist an algorithm that could easily
> translate any statement in one language into another *and* finding that
> algorithm would require only some polynomial quantity of resources
> (relative to the number of possible algorithms). It should be easy to
> show that this is not the case.
>

I don't understand why you're focusing on NP-hard problems... NP-hard
problems are solvable algorithmically, just not efficiently. When I read
you (I'm surely misinterpreting), it seems like you're saying you can't
solve NP-hard problems... that's not the case, but as your input grows,
the time to solve the problem may exceed the time elapsed since the Big
Bang. You could say that NP-hard problems, for most inputs, are not
technically/practically solvable, but they are in theory (you have the
algorithm), unlike undecidable problems such as the halting problem.
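
To make that concrete, here is a minimal Python sketch of the blow-up
(the checking speed and the age-of-universe figure are rough assumed
numbers, purely for illustration): a brute-force search over all 2**n
truth assignments of n Boolean variables, as in naive SAT solving.

    # Brute-force cost of an NP-hard search: 2**n candidates for n
    # variables. The machine speed below is an assumption, not a
    # measurement.
    AGE_OF_UNIVERSE_S = 4.35e17   # rough seconds since the Big Bang
    CHECKS_PER_SECOND = 1e9       # assumed: a billion checks/second

    for n in (20, 40, 60, 80, 100):
        seconds = 2.0 ** n / CHECKS_PER_SECOND
        ratio = seconds / AGE_OF_UNIVERSE_S
        print("n=%3d: %.2e s (%.2e x age of universe)" % (n, seconds, ratio))

At n = 100 the search takes thousands of times the age of the universe,
even though the algorithm itself is trivial to state.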

Quentin


>     I strongly believe that computational complexity plays a huge role in
> many aspects of the hard problem of consciousness and that the Platonic
> approach to computer science is obscuring solutions as it is blind to
> questions of resource availability and distribution.
>
>>>> In a thought experiment we can say that the imitation stimulates the
>>>> surrounding neurons in the same way as the original. We can even say
>>>> that it does this miraculously. Would such a device *necessarily*
>>>> replicate the consciousness along with the neural impulses, or could
>>>> the two be separated?
>>>>
>>>
>>>      Is the brain strictly a classical system?
>>>
>> No, although the consensus appears to be that quantum effects are not
>> significant in its functioning. In any case, this does not invalidate
>> functionalism.
>>
>
>     Well, I don't follow the crowd. I agree that functionalism is not
> dependent on the type of physics of the system, but there is an issue of
> functional closure that must be met in my conjecture; there has to be some
> way for the system (that supports the conscious capacity) to be closed
> under the transformation involved.
>
>>>> As I said, technical problems with computers are not relevant to the
>>>> argument. The implant is just a device that has the correct timing of
>>>> neural impulses. Would it necessarily preserve consciousness?
>>>>
>>>>
>>>
>>>      Let's see. If I ingest psychoactive substances, there is a 1p
>>> observable effect... Is this a circumstance that is different in kind
>>> from that device?
>>>
>> The psychoactive substances cause a physical change in your brain and
>> thereby also a psychological change.
>>
>
>     Of course. As I see it, there is no brain change without a mind
> change and vice versa. The mind and brain are dual, as Boolean algebras
> and topological spaces are dual; the relation is an isomorphism between
> structures that have oppositely directed arrows of transformation. The
> math is very straightforward... People just have a hard time
> understanding the idea that all of "matter" is some form of topological
> space, and there is no known calculus of variations for Boolean
> algebras (no one is looking for it, except for me, that I know of).
> Care to help me? The idea of SPK-E -> SPK-E+C, that you mentioned, is
> an example of a variation of a Boolean algebra!
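>
>     In the finite case the "oppositely directed arrows" are easy to
> exhibit: any map f: X -> Y between point sets induces a Boolean algebra
> homomorphism going the *opposite* way, namely the preimage map from
> subsets of Y to subsets of X. A minimal Python sketch (the sets X and Y
> and the map f are arbitrary illustrative choices):
>
>     from itertools import product
>
>     X = {0, 1, 2}
>     Y = {'a', 'b'}
>     f = {0: 'a', 1: 'a', 2: 'b'}   # an arbitrary map X -> Y
>
>     def preimage(S):
>         # The dual map P(Y) -> P(X): pull a subset of Y back along f.
>         return frozenset(x for x in X if f[x] in S)
>
>     def subsets(base):
>         # All subsets of a finite set: the Boolean algebra P(base).
>         items = list(base)
>         return [frozenset(i for i, keep in zip(items, bits) if keep)
>                 for bits in product([False, True], repeat=len(items))]
>
>     # Preimage preserves join, meet and complement, so it is a Boolean
>     # algebra homomorphism -- with the arrow reversed.
>     for A in subsets(Y):
>         for B in subsets(Y):
>             assert preimage(A | B) == preimage(A) | preimage(B)
>             assert preimage(A & B) == preimage(A) & preimage(B)
>         assert preimage(Y - A) == X - preimage(A)
>
>     print("f: X -> Y induces a Boolean homomorphism P(Y) -> P(X)")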
>
> --
> Onward!
>
> Stephen


-- 
All those moments will be lost in time, like tears in rain.
