Hi Bruno Marchal  

If you want to be the one who judges, who decides what
is best or whether it is logical or not, that's not trust; it's
the way of the world. Secularism.

The problem with secularism is that it cannot
help you in a time of suffering or sorrow.


Roger Clough, rclo...@verizon.net 
9/20/2012  
"Forever is a long time, especially near the end." -Woody Allen 


----- Receiving the following content -----  
From: Bruno Marchal  
Receiver: everything-list  
Time: 2012-09-20, 06:06:06 
Subject: Re: IMHO consciousness is an activity not a thing 


On 20 Sep 2012, at 11:45, Roger Clough wrote: 

> 
> 
> 
> BRUNO: I think that your metaphysics and reading of Leibniz make  
> sense to me, and for comp, but I have to say I don't follow your  
> methodology or teaching method in the religious field, as it  
> contains authoritative arguments. 
> 
> ROGER: Everything I write should be prefaced with IMHO. 
> 
> BRUNO: My feeling is that authoritative arguments are the symptom of  
> those who lack faith. 
> 
> ROGER: That doesn't make sense, because faith = trust. And if you  
> don't trust, nothing is authoritative. 
> 
> BRUNO: That error is multiplied in the transfinite when an  
> authoritative argument is attributed to God. 
> 
> ROGER: Sorry, no comprendo. 

I can trust entities which provide explanations, not entities  
threatening me with torture in case I do not love them. 
Humans have attributed authoritative arguments to God, with the result  
of justifying their own use of them. 
I can understand such arguments in warfare, or when a decision must be  
taken without the time for rational deliberation, but in the  
religious field I think that authoritative arguments have to fail:  
they only display the lack of faith of those who use them or, more  
often, their special terrestrial interests. 




> 
> BRUNO: Can you answer the following question? 
> 
> How could anyone love a God, or a Goddess, threatening you with  
> eternal torture in case you don't love Him or Her? 
> 
> That's bizarre. 
> 
> How could even just an atom of sincerity reside in that love, with  
> such an explicit horrible threat? 
> 
> ROGER: That love, and all love, comes from God, not from me. 

But then why does God have to threaten his creatures to get love from them?  
And again, how could that love be sincere? 
This does not make sense. 

Bruno 



> 
> BRUNO: I hope you don't mind my frankness and the naivety of my  
> questioning. 
> 
> Bruno 
> 
> ROGER: Not at all, as in my experience most agnosticism or atheism 
> is a product of ignorance, if you don't mind my saying that. :-) 
> 
> 
> Roger Clough, rclo...@verizon.net 
> 9/19/2012 
> "Forever is a long time, especially near the end." -Woody Allen 
> 
> 
> ----- Receiving the following content ----- 
> From: John Mikes 
> Receiver: everything-list 
> Time: 2012-09-18, 17:17:40 
> Subject: Re: IMHO consciousness is an activity not a thing 
> 
> 
> Ha ha: so it is not consciousness that is the 'thing', but  
> 'intelligence'? Or is this one also a function (of the brain towards  
> the self)? Who is the self? How does the brain 
> DO something (as a homunculus?) on its own? Any suggestions? 
> John M 
> 
> 
> On Tue, Sep 18, 2012 at 6:07 AM, Roger Clough wrote: 
> 
> Hi Craig Weinberg 
> 
> IMHO consciousness is not really anything in itself; 
> it is what the brain makes of its contents that the self 
> perceives. The self is intelligence, which is 
> able to focus all pertinent brain activity to a unified point. 
> 
> Roger Clough, rclo...@verizon.net 
> 9/18/2012 
> "Forever is a long time, especially near the end." 
> Woody Allen 
> 
> ----- Receiving the following content ----- 
> From: Craig Weinberg 
> Receiver: everything-list 
> Time: 2012-09-17, 23:43:08 
> Subject: Re: Zombieopolis Thought Experiment 
> 
> 
> 
> 
> On Monday, September 17, 2012 11:02:16 PM UTC-4, stathisp wrote: 
> On Tue, Sep 18, 2012 at 6:39 AM, Craig Weinberg wrote: 
> 
>> I understand that, but it still assumes that there is such a thing  
>> as a set 
>> of functions which could be identified and reproduced that cause 
>> consciousness. I don't assume that, because consciousness isn't like 
>> anything else. It is the source of all functions and appearances,  
>> not the 
>> effect of them. Once you have consciousness in the universe, then  
>> it can be 
>> enhanced and altered in infinite ways, but none of them can replace  
>> the 
>> experience that is your own. 
> 
> No, the paper does *not* assume that there is a set of functions that, 
> if reproduced, will cause consciousness. It assumes that something 
> like what you are saying is right. 
> 
> 
> By assume I mean the implicit assumptions which are unstated in the  
> paper. The thought experiment comes out of a paradox arising from  
> assumptions about qualia and the brain which are both false in my  
> view. I see the brain as the flattened qualia of human experience. 
> 
> 
> 
>>>>> This is the point of the thought experiment. The limitations of  
>>>>> all 
>>>>> forms of 
>>>>> measurement and perception preclude all possibility of there ever  
>>>>> being such a thing as an exhaustively complete set of third-person  
>>>>> behaviors of any system. 
>>>>> 
>>>>> What is it that you don't think I understand? 
>>>> 
>>>> What you don't understand is that an exhaustively complete set of 
>>>> behaviours is not required. 
>>> 
>>> 
>>> Yes, it is. Not for prosthetic enhancements or repairs to a nervous 
>>> system; but to replace a nervous system without replacing the  
>>> person who is using it, there is no set of behaviors which can ever  
>>> be exhaustive enough in theory to accomplish that. You might be able  
>>> to do it biologically, but there is no reason to trust it unless and  
>>> until someone can be walked off of their brain for a few weeks or  
>>> months and then walked back on. 
>>> 
>>> 
>>> The replacement components need only be within the engineering  
>>> tolerance 
>>> of the nervous system components. This is a difficult task but it is 
>>> achievable in principle. 
>> 
>> 
>> You assume that consciousness can be replaced, but I understand  
>> exactly why 
>> it can't. You can believe that there is no difference between  
>> scooping out 
>> your brain stem and replacing it with a functional equivalent as  
>> long as it 
>> was well engineered, but to me it's a completely misguided notion. 
>> Consciousness doesn't exist on the outside of us. Engineering only  
>> deals 
>> with exteriors. If the universe were designed by engineers, there  
>> could be 
>> no consciousness. 
> 
> Yes, that is exactly what the paper assumes. Exactly that! 
> 
> 
> It still is modeling the experience of qualia as having a  
> quantitative relation with the ratio of brain to non-brain. That  
> isn't the only way to model it, and I use a different model. 
> 
> 
>>> I assume that my friends have not been replaced by robots. If they  
>>> have 
>>> been then that means the robots can almost perfectly replicate their 
>>> behaviour, since I (and people in general) am very good at picking  
>>> up even 
>>> tiny deviations from normal behaviour. The question then is, if  
>>> the function 
>>> of a human can be replicated this closely by a machine does that  
>>> mean the 
>>> consciousness can also be replicated? The answer is yes, since  
>>> otherwise we 
>>> would have the possibility of a person having radically different 
>>> experiences but behaving normally and being unaware that their  
>>> experiences 
>>> were different. 
>> 
>> 
>> The answer is no. A cartoon of Bugs Bunny has no experiences but  
>> behaves 
>> just like Bugs Bunny would if he had experiences. You are eating  
>> the menu. 
> 
> And if it were possible to replicate the behaviour without the 
> experiences - i.e. make a zombie - it would be possible to make a 
> partial zombie, which lacks some experiences but behaves normally and 
> doesn't realise that it lacks those experiences. Do you agree that 
> this is the implication? If not, where is the flaw in the reasoning? 
> 
> 
> The word zombie implies that you have an expectation of  
> consciousness but there isn't any. That is a fallacy from the start,  
> since there is no reason to expect a simulation to have any  
> experience at all. It's not a zombie, it's a puppet. 
> 
> A partial zombie is just someone who has brain damage, and yes if  
> you tried to replace enough of a person's brain with a  
> non-biological material, you would get brain damage, dementia, coma, and  
> death. 
> 
> Craig 
> 
> 
> 
> 
> --  
> Stathis Papaioannou 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://iridia.ulb.ac.be/~marchal/ 
> 
> 

http://iridia.ulb.ac.be/~marchal/ 




