2012/5/9 meekerdb <meeke...@verizon.net>

>  On 5/9/2012 1:11 PM, Quentin Anciaux wrote:
>
>
>
> 2012/5/9 meekerdb <meeke...@verizon.net>
>
>>   On 5/9/2012 12:09 PM, Quentin Anciaux wrote:
>>
>>
>>
>> 2012/5/9 meekerdb <meeke...@verizon.net>
>>
>>>   On 5/9/2012 11:43 AM, Quentin Anciaux wrote:
>>>
>>>
>>>
>>> 2012/5/9 meekerdb <meeke...@verizon.net>
>>>
>>>>   On 5/9/2012 2:30 AM, Quentin Anciaux wrote:
>>>>
>>>>
>>>>
>>>> 2012/5/9 meekerdb <meeke...@verizon.net>
>>>>
>>>>>  On 5/8/2012 4:24 PM, Stathis Papaioannou wrote:
>>>>>
>>>>> On Wed, May 9, 2012 at 5:52 AM, John Mikes <jami...@gmail.com> wrote:
>>>>>
>>>>>  Stathis: what's your definition? - JM
>>>>>
>>>>> On Sat, May 5, 2012 at 6:56 PM, Stathis Papaioannou <stath...@gmail.com> wrote:
>>>>>
>>>>>  On Sat, May 5, 2012 at 10:46 PM, Evgenii Rudnyi <use...@rudnyi.ru> wrote:
>>>>>
>>>>>  I have started listening to Beginning of Infinity and joined the
>>>>> discussion
>>>>> list for the book. Right now there is a discussion there
>>>>>
>>>>> Free will in MWI
>>>>> http://groups.google.com/group/beginning-of-infinity/t/b172f0e03d68bcc6
>>>>>
>>>>> I am at the beginning of the book and I do not know for sure, but from the
>>>>> answers to this discussion it seems that according to David Deutsch one can
>>>>> find free will in MWI.
>>>>>
>>>>>  One can find or not find free will anywhere depending on how one
>>>>> defines it. That is the entire issue with free will.
>>>>>
>>>>>  My definition: free will is when you're not sure you're going to do
>>>>> something until you've done it.
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> So if I carefully weigh my options and decide on one, it's not free
>>>>> will?  I'd say free will is making any choice that is not coerced by
>>>>> another agent.
>>>>>
>>>>> Brent
>>>>>
>>>>
>>>> It's compatible with what Stathis said... until you've made the actual
>>>> choice, you didn't do it and didn't know what it would be... the "do
>>>> something" in Stathis' definition can be read as "you're not sure what
>>>> you'll choose until you've chosen it."
>>>>
>>>>
>>>>  Are you saying that one *never* knows what they are going to do until
>>>> they do it...
>>>>
>>>
>>>  You have some knowledge of what you'll do... but you can only really
>>> "know" retrospectively. IOW, you are your own fastest simulator... if that
>>> were not the case, it would be possible to implement a faster algorithm
>>> able to predict what you'll do before you even do it... that seems paradoxical.
>>>
>>>
>>>  I don't see anything paradoxical about it.  A computer that duplicated
>>> your brain's neural network but used electrical or photonic signals
>>> (instead of electrochemical) would be orders of magnitude faster.
>>>
>>
>> It's paradoxical, because if it could, I could know the outcome; and if I
>> could know the outcome, then I could do something else; and if I did
>> something else, then the simulation of that superphotonic computer would be
>> wrong. Hence the hypothesis that it could simulate my choice faster than me
>> is impossible (because if it could, it *must* take into account my future
>> knowledge of my choice, and if it does not, it is no faster at simulating
>> what I'll do than I am).
>>
>>
>>  That's an incoherent paradox.  You've now assumed not only that your
>> brain is simulated, so your action is known in advance, but also that the
>> simulation's output is fed back to your brain so that it influences the
>> action.  That's changing the problem and essentially creating
>> brain+simulator = brain'.  The fact that brain' =/= brain is hardly paradoxical.
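To put the feedback point in toy form (a sketch only, in Python; the "brain"
and "simulator" functions below are made-up stand-ins, not a model of a real
brain): a predictor of the isolated brain stays correct as long as its output
is withheld, and the moment the prediction is fed back, the system being
predicted is brain+simulator, i.e. brain', so the original prediction need
not apply.

    # Toy sketch only: "brain" and "simulator" are illustrative stand-ins.
    def brain(observed_prediction=None):
        # The agent picks "A" by default, but if it is shown a prediction
        # of its own choice, it deliberately does the opposite.
        if observed_prediction is None:
            return "A"
        return "B" if observed_prediction == "A" else "A"

    def simulator():
        # A faster copy of the *isolated* brain: it predicts what the brain
        # does when no prediction is fed back to it.
        return brain(observed_prediction=None)

    prediction = simulator()

    # Prediction kept secret: the simulator is simply right; no paradox.
    assert prediction == brain()

    # Prediction fed back: the predicted system is now brain+simulator
    # ("brain'"), and the old prediction no longer applies.
    assert prediction != brain(observed_prediction=prediction)

The last assert is just the brain' =/= brain observation: once the prediction
influences the action, the simulator was never modelling the fed-back system
in the first place, so nothing contradictory has happened.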
>>
>
> Hmm, OK... I have to think about it a little more.
>
>>
>>
>>
>>
>>> But this has no effect on the compatibilist idea of free will (the
>>> kind of free will worth having).
>>>
>>>
>>>
>>>
>>>> which then by Stathis' definition means that every action is free will
>>>> and coercion is impossible?
>>>>
>>>
>>> Coercion limits your choices, not your will; you can still choose to die
>>> (if the choice was between your life and something else, for example). You
>>> can always choose if you can think; it's not because the only available
>>> choices are bad that your free will has suddenly disappeared.
>>>
>>>
>>>  So would it be an unfree will if an external agent directly injected
>>> chemicals or electrical signals into your brain thereby causing a choice
>>> actually made by the external agent?
>>>
>>
>> yes
>>
>>
>>
>>  Why is it still "you" if your brain is hooked up to something that
>> allows an external agent to control your body?
>>
>>
> I said the contrary... You asked if it would be unfree, and I answered
> "yes" (it would be unfree in this case).
>
>
> OK, we agree on that.
>
>
>
>
>>
>>
>>
>>
>>> How is this different from an external agent directly injecting
>>> information via your senses and thereby causing a choice actually
>>> made by the agent?
>>>
>>>  In the first case *you* choose,
>>
>>
> But you said in the first case 'you' were unfree??
>
>
>     in the second case you don't.
>>
>>
>>  ?? That's the reverse of your previous post in which you held that an
>> external agent threatening you does not remove your
>> free will.  You said it just limited your choices; you still chose.
>> Did you read my post correctly?
>>
>
> Yes, I read it correctly. If you fed chemicals and electrical signals into
> my brain, then I did not *choose*.
>
>
> That's the first case, but not the second.
>
>
>
> So in the case where I'm coerced by an external agent by external means, I
> can still choose; only the available choices are reduced (and all of them
> can be bad). If the agent feeds me drugs/electrical signals that make me act
> like a puppet, I can't choose.
>
> So in the first case (coerced by external means) I can choose and still have
> free will, albeit with limited, bad choices; in the second case (your
> thought experiment) I don't have free will.
>
>>
>> Brent
>>
>
> Maybe we need to number these:
>
> (1) "...an external agent directly injected chemicals or electrical signals
> into your brain thereby causing a choice actually made by the external
> agent."
>
>     To which you answered "Yes (that's unfree)." AND "...in the first case
> I can still choose."
>
> (2) "...an external agent directly injecting information *via your senses*..."
>
>
>     To which you answered "...in the second case I don't have free will."
>
> Yet (2) consists only of the external agent talking to you and threatening
> or cajoling.
>
> Brent
>

And here I made it clear that the first case was the one I was talking about
originally, and that the second case was your thought experiment (the
chemical/electrical puppeting trick), which came after.

Whatever; there is no point repeating that I answered "I can still choose"
when I was unfree. I did not say that.




-- 
All those moments will be lost in time, like tears in rain.

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.
