On Wednesday, March 13, 2013 12:32:34 PM UTC-4, John Clark wrote:
>
> On Mon, Mar 11, 2013, Craig Weinberg <whats...@gmail.com> wrote:
>
>> the phrase "dragons exist" or "God exists" is not gibberish, just wrong, 
>>> and "free will" is not even wrong.  I'm saying that  if "free will" doesn't 
>>> exist and “free will” doesn't not exist then that’s just another way of 
>>> saying that "free will" is gibberish.
>>>
>>
>> > I'm not saying that free will doesn't exist though,
>>
>  
> I know, you’re not talking about something that does not exist, you’re 
> talking about something that is not deterministic and not not 
> deterministic. In other words you’re talking gibberish.  
>

It's mind-boggling to me that you have no capacity to tolerate the obvious 
non-Aristotelian qualities of nature. The color white is not red, but since 
white cannot be made without red wavelengths, it can't be said to be *not* 
red either. Warm water can be said not to be hot, but also not to be not 
hot, in that it includes hot water mixed with cold. Where are you getting 
this fantasy expectation that everything must fit into one box or the 
opposite box?
 

>  
>
>> > Why do you put free will in a different category from dragons or God
>>
>  
> Because both dragons and God are well defined concepts, just concepts that 
> don’t happen to have the attribute of existence. In contrast, “free will” is 
> not only incoherently defined, it is every bit as self-contradictory as the 
> largest prime number is.
>

Free will doesn't need to be defined because it is inescapable and obvious. 
Color doesn't need to be defined either, or hunger, or itching.
 

>
> >> When the computer reaches its goal we know because when it reaches the 
>>> billionth digit of PI the machine will stop.
>>>
>>
>> > The machines stops because the programmer has programmed it to stop,
>>
>  
> Yes. So what?  Just exactly like you the program is the way it is for a 
> reason OR it is the way it is for no reason.
>

The reason that the machine stops has nothing to do with any goal of the 
machine. Your view has no way to accommodate the reality that meaning can 
be projected onto actions by an audience. In your world, giving a machine 
the full range of human emotions is as simple as sending it emoticons. A 
smiley face can't just be ASCII text; it must be an actual smile, because 
how things look to you is how they must actually be.
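Here's a trivial sketch of the situation (my own toy example, counting to a 
stand-in target rather than actually computing the billionth digit of pi). 
Notice where the "goal" lives: in a halting condition the programmer wrote, 
not in anything the loop itself cares about:

```python
# A machine that "reaches its goal": it stops when a counter hits a
# target chosen by the programmer. The halting condition encodes the
# programmer's intent; the loop itself only ever compares two numbers.
TARGET = 1_000  # stand-in for "the billionth digit of pi"

digits_computed = 0
while digits_computed < TARGET:
    digits_computed += 1  # stand-in for computing one more digit

print(digits_computed)  # prints 1000 -- the machine "stopped at its goal"
```

The stopping looks purposeful to us, the audience, but the program has no 
stake in whether the comparison comes out true or false.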
  

>  
>
>> > not because the machine had a goal which was satisfied. There is a huge 
>> difference.
>>
>  
> If there is a huge difference it’s a bit odd that you are unable to 
> rationally describe even a tiny difference without just decreeing without 
> evidence or argument that certain things do or do not have subjective 
> states; and after all logically  investigating those states is the entire 
> point of the debate so your faith based assertions are not helpful. 
>

Do you believe that this: ;-) has an emotion? Does the computer have an 
emotion about it? Do the bits in RAM or the pixels on the screen have a 
feeling about what ;-) means? Why not?
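To make the point concrete, here is a quick check (my own sketch in Python): 
at the level the machine operates on, the winking smiley is nothing but 
three small integers, with nowhere for a feeling to live:

```python
# The emoticon ";-)" as the machine holds it: just byte values.
emoticon = ";-)"
byte_values = [ord(ch) for ch in emoticon]
print(byte_values)  # prints [59, 45, 41] -- the ASCII codes for ';', '-', ')'
```

Whatever the wink means, it means it to us, not to the integers 59, 45, 41.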


> > I can catch a mouse in a trap and the mouse will stop moving.
>>
>
> True, and the mouse trap will stop moving too.
>

You could make one that resets itself. What's the difference?
 

>
> > That doesn't mean that the mouse has achieved some kind of personal 
>> mouse goal.
>>
>
> Also true, not every living thing successfully reaches its goal and not 
> everything even has a goal but the mouse trap certainly did, it was built 
> to move very fast and then stop if it was touched, and that is exactly what 
> happened.   
>

It could have been a child's finger broken in the trap instead. The trap 
would have broken it with exactly the same indifference. The mouse trap has 
no goal. 
 

>
> >> And the motion of your thumb on the joystick of the computer game you 
>>> were playing was set into motion by the computer, which will stop when it 
>>> reaches its goal, the end of the game.
>>>
>>
>> > I respond to the game voluntarily,
>>
>  
> So you responded the way you did for a reason, namely because you wanted 
> to. The computer game responds the way it does for a reason too.
>

'Because I wanted to' is the opposite of 'because it is programmed to'. The 
former intentionally creates and initiates a sequence of actions; the latter 
merely executes, acting out instructions it never chose.

 
>
>> > the game responds to me unconsciously
>>
>  
> As I said, the entire point of this conversation is to investigate what is 
> conscious and what is not, so for you to decree without evidence or 
> argument that this, this, and this are conscious but that, that, and that 
> are not just doesn’t get us very far.
>

Consciousness itself cannot be accessed by third person evidence. That 
doesn't mean that we have no access to valid intuition and judgment beyond 
the evidence of objects. That gets us as far as we need to get. There might 
be a way to conduct some useful experiments to prove whether or not people 
can unconsciously detect the presence of living organisms. I'd be in favor 
of that, but I don't need it to know exactly why machines built from the 
bottom up from human motives are different from organisms that grow from the 
inside out from their own motives.

 
>
>>  > people do often have control over their actions,
>>
>
> And people have control over their actions for a reason and so are 
> deterministic, or they have control over their actions for no reason and so 
> are random; and if they have no control over their impulses to murder then 
> they should be treated more harshly, not less, than those that do, because 
> they are far more dangerous.
>

What do you mean by "control over their impulses"? How does such a concept 
fit in with determinism?


> > we can and do regularly make criminal judgments toward determining just 
>> exactly the degree to which people are culpable for their own actions.
>>
>  
> 100%, or at least in a rational society that's what it should be.
>  
>
>> >> everybody should be responsible for their actions. 
>>>
>>
>> > That would be idiotic. It would mean that if someone knocks you 
>> unconscious and puts you in a Google car that runs someone over, then you 
>> are guilty of murder
>>
>  
> Speaking of idiotic […]
>  
>
>> >  why would punishing people (putting them in rooms by themselves?) 
>> deter other people?
>>
>  
> Because most other people don’t want to be put into rooms by themselves. I 
> can’t help but think you could have figured that out for yourself if you 
> tried real hard.
>

I'm trying to show you the absurdity of your position. Deterrence makes no 
sense to a machine. Punishing water for rolling down a window, or a 
computer for crashing in the middle of your movie, does no good. It doesn't 
matter whether the water or the computer is punished, because punishment 
can't make any computers or water "want" to avoid the same fate. Even if it 
could, without free will, their "want" isn't connected to anything that can 
cause changes in the universe.
 

>
> > Complexity may or may not be a symptom of sophistication, but it is the 
>> cause of nothing.
>>
>
>  Speaking of idiotic […]
>

Speaking of unsupported ad hominem remarks.
 

>
>  > Punishment as a deterrent relies absolutely on causally efficacious 
>> free will.
>>
>  
> Deterrence wouldn’t work if people’s actions were always non-deterministic 
> (random)
>

It wouldn't work if people's actions were always deterministic either. 
Deterrence can *ONLY* work where people have some ability to voluntarily 
and intentionally control their own actions, in spite of deterministic or 
random influences.
 

> ; but as to "causally efficacious free will" I honestly don’t know because 
> I have no idea what that means. I don’t know what non-causally efficacious 
> free will is either.
>

Causally efficacious free will means the ability to take voluntary actions 
in the world that have real effects. Non-causally efficacious free will 
would be if you think that you are taking voluntary actions in the world, 
but actually you are paralyzed as far as anyone else is concerned. 

>  
>
>> > Yes, because we have free will. If we didn't have free will, then it 
>> would not matter whether you were fool or genius, alive or dead. What 
>> difference would it make to the universe?
>
>  
> Cannot comment, don’t know what ASCII sequence “free will” means.
>

Back to the old dodge...
 

>
> > that has nothing to do with the ontology of free will.
>>
>  
> I agree. If free will is gibberish then the ontology of free will is 
> gibberish too and NOTHING has anything to do with it or with anything else 
> for that matter. "Free will" means nothing and is good for nothing except 
> perhaps as a label for existential confusion.
>

Free will is so obvious that a four-year-old can understand it. There is no 
culture on Earth which fails to recognize the obvious and unavoidable 
reality of our own voluntary participation in the world.

Craig


>   John K Clark
>  
>   
>

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to everything-list@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list?hl=en.
For more options, visit https://groups.google.com/groups/opt_out.

