On Monday, March 18, 2013 6:01:18 AM UTC-4, Bruno Marchal wrote:
>
>
> On 17 Mar 2013, at 17:02, Craig Weinberg wrote:
>
>
>
> On Sunday, March 17, 2013 10:47:05 AM UTC-4, Bruno Marchal wrote:
>>
>>
>> On 17 Mar 2013, at 03:47, Craig Weinberg wrote:
>>
>>
>>
>> On Saturday, March 16, 2013 3:15:43 PM UTC-4, Bruno Marchal wrote:
>>>
>>>
>>> On 15 Mar 2013, at 20:38, Craig Weinberg wrote:
>>>
>>>
>>>
>>> On Friday, March 15, 2013 3:04:24 PM UTC-4, Terren Suydam wrote:
>>>>
>>>> No, I think that you haven't understood it, 
>>>>
>>>
>>> That's because you are only working with a straw man of me. What is it 
>>> that you think that I don't understand? The legacy view is that if you have 
>>> many molecular systems working together mechanically, you will naturally 
>>> get emergent properties that could be mistaken for teleological entities. 
>>> You can't tell the difference between a brain change that seems meaningful 
>>> to you and a meaningful experience which causes a brain change. Just 
>>> because you feel like you are moving your arm doesn't mean that isn't just 
>>> a narrative fiction that serves a valuable evolutionary purpose.
>>>
>>> All of that is fine, in some other theoretical universe. In our universe 
>>> however, it can't work. There is no evolutionary purpose for consciousness 
>>> or narrative fictions. The existence of the feeling that you can control 
>>> your body makes no sense in a universe where control is impersonal and 
>>> involuntary. There is no possibility for teleology to even be conceived in 
>>> a universe of endless meaningless chain reactions - no basis for 
>>> proprietary attachment of any kind. It's circular to imagine that it could 
>>> be important for an epiphenomenal self to believe it is phenomenal. 
>>> Important how? It's like adding a steering wheel to a mountain.
>>>  
>>>
>>>> due to whatever biases have led you to invest so much in your theory - 
>>>> a theory which is AFAICT completely unfalsifiable and predicts nothing.
>>>>
>>>
>>> No theory which models consciousness will ever be falsifiable, because 
>>> falsifiability is a quality within consciousness. As far as prediction 
>>> goes, one of the things it predicts is that people who are bound to the 
>>> extremes of the philosophical spectrum will be intolerant and misrepresent 
>>> other perspectives. They will cling pathologically to unreal abstractions 
>>> while flatly denying ordinary experience.
>>>
>>>
>>> Materialism + computationalism can lead to nihilism. But 
>>> computationalism, per se,  does not deny ordinary experiences. It starts 
>>> from that, as it is a principle of invariance of consciousness for a 
>>> digital substitution made at some level.
>>>
>>
>> It may not deny ordinary experiences, but it doesn't support them 
>> rationally either. 
>>
>>
>> It supports them as much as possible. It supports some irrationalism like 
>> non-communicable truth on the part of the machine.
>>
>
> Being non-communicable is a property of experience, but non-communicability 
> itself doesn't imply experience at all. 
>
>
> You are right. But knowledge of a non-communicable truth has to be 
> experienced, maybe. 
>

In reality, I agree, because I think that is the symmetry: Phenomena are 
extended publicly on the outside, and intended privately on the inside - 
but that is multisense realism physics, not arithmetic. Arithmetic would 
have to provide a way to get to that quality theoretically. Why, as far as 
numbers are concerned, does privacy equate to "experience"? 

>
>
>
>
> Experience can imply  a use for computation, as a method of distributing 
> access to experiential qualities, but computation cannot imply a use for 
> experience. 
>
>
> That contradicts what machines already say when looking inward. It is not 
> the computation which is thinking, but the person supported by one (and 
> then by an infinity of them).
> You deny the existence of that person, and I don't see why. Bringing 
> matter, time or indeterminacy does not help.
>

If machines can all be made to say the same thing when looking inward, then 
I don't think that they are having an experience. 


>
>
>
>
> As someone brought up in another conversation on FB, the construction of 
> neural networks coincides with the end of conscious involvement 
>
>
> If you decide so, it might as well lead to that. But this idea is based on 
> a confusion between syntax and semantics. Simple programs can have complex 
> semantics, complex enough that we should be cautious about attributing or 
> not attributing consciousness. 
>

I don't think it is a decision based on syntax and semantics at all; it is 
an observation about learning and memory. When we learn, we lose the 
necessity of conscious awareness of what we have learned, and at the same 
time, we observe that connections in our neural network are strengthened or 
extended.
 

>
>
>
>
> - the disappearance of personal attention into automatism. Learning makes 
> consciousness redundant. Repetition allows awareness to withdraw from the 
> act, which becomes robotic.
>
>
> No worry. Our environment should be rich enough to remind us that our 
> lives should not be taken for granted. 
>

Only because we are conscious to begin with, and our neural networks follow 
our experience. If we build a synthetic neural network, there is no 
experience to drive it, only computations, the skeleton of which describes 
unexperienced events.
 

>
>
>
>
>  
>
>>
>>
>>
>> What reason is there for computation to be processed as an ordinary 
>> experience, when it can clearly be accomplished through a-signifying 
>> mechanical activities?
>>
>>
>>
>> You lost me here. 
>>
>
> We see that generic mechanical activities can be used to imitate 
> experiences without actually embodying them.
>
>
>
> We can see that, but that is only a partial view of the truth. Even for a 
> machine, we know that the syntactical description of the behavior of its 
> components gives only a partial description of what the machine is able 
> to know, without any external observer capable of guessing that truth. You 
> continue to treat machines in a pre-Turing-Gödel way. You defend a 
> reductionist conception of machines, which does not fit what we already 
> know about them. 
>

What makes Turing-Gödel more convincing as a mechanism for experience 
rather than as an imitator of it? To me, a computer is the very image of 
imitation - the universal purveyor of disconnected fragments and ungrounded 
illusion. The entire contents of the internet are pure human sense; all of 
the computers in the world contribute nothing to it except access to our 
own digitally mediated content. Computation is a medium with no message.


>
>
> Illuminated pixels can stimulate our consciousness to experience 
> characters and scenes which are not literally present in the pixels. The 
> pixel arrangements do not literally become people and places.
>
>
> A computation is something far richer and more subtle than any pixel 
> arrangement. 
>

Sure, but rich and subtle to our sensibilities. To the computation itself, 
I think that richness and subtlety are very likely meaningless.

Craig
 

>
> Bruno
>
>
>
>
>
> Craig
>  
>
>>
>> Bruno
>>
>>
>>
>>
>> Craig
>>  
>>
>>>
>>> Bruno
>>>
>>>
>>>
>>>
>>> Craig
>>>
>>>  
>>>>
>>>
>>>>
>>>>
>>>> On Fri, Mar 15, 2013 at 2:02 PM, Craig Weinberg <whats...@gmail.com>wrote:
>>>>
>>>>>
>>>>>
>>>>> On Friday, March 15, 2013 1:55:26 PM UTC-4, Terren Suydam wrote:
>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Mar 15, 2013 at 1:38 PM, Craig Weinberg 
>>>>>> <whats...@gmail.com>wrote:
>>>>>>
>>>>>>>
>>>>>>> Exactly. It is interesting also in that it seems to be like one of 
>>>>>>> those ambiguous images, in that as long as people are focused on one 
>>>>>>> fixed 
>>>>>>> idea of reality, they are honestly incapable of seeing any other, even 
>>>>>>> if 
>>>>>>> they themselves are sitting on top of it.
>>>>>>>
>>>>>>>
>>>>>> The irony in that statement is staggering. I couldn't satirize you 
>>>>>> any better if I tried. 
>>>>>>
>>>>>
>>>>> Why, do you think that I have never considered the bottom-up model of 
>>>>> causation?
>>>>>
>>>>>
>>>>
>>>>

