On 1/10/2014 11:29 PM, Stephen Paul King wrote:
"Hmm? Steven turns into a White Rabbit is not a /*logical*/ contradiction, it's a
/*nomological*/ one. If there's a transition from (t1,x1) to (t2,x2) it seems the only
/*logical*/ contradiction would be x2="Not x1 at t1." "Logical" is a very weak
condition; as far as I know it just means being consistent=(not every sentence is a
relating to or denoting certain principles, such as laws of nature, that
logically necessary nor theoretically explicable, but are simply taken as
Right! It was a very crude and informal explanation.
But it's not an explanation at all if it assumes regularities "such as laws of nature"
because those are what we're trying to explain.
Things become, hopefully, clearer when one considers the scenario where there are
many minds communicating and interacting while evolving. Interaction requires some
level of similarity between the participants.
For example, if I were to experience a White Rabbit, what effects would it have to
have on the others I interact with so that it would not affect their 1p content? I
would say that it was a hallucination, maybe... We forget that what we experience of the
world is not the world itself; it is our mind/brain's version of it. We have to take
the capacity for hallucination into account in our thinking about what a mind is...
Can we not take as "true" what we experience? How can we know that it is not
some controlled simulation? We need to answer Descartes question: How do I know that I
am not just a brain in a vat (or a computation running in some UD)?
Sure, we can take our experiences as "true" in the sense that we know we have the
experience. Then we reason from there and hypothesize models of the world to make sense
of our experiences. But I think it makes no sense to speculate that I am a brain in a
vat: what difference would it make? My model for understanding my experiences could only
accommodate that by assuming much more about the vat, the stimulus to my brain, who made
the vat,... it's just a lot of extra complication with no predictive power.
You received this message because you are subscribed to the Google Groups
"Everything List" group.