On Thu, Apr 18, 2013 at 10:24 PM, Craig Weinberg <whatsons...@gmail.com> wrote:

>> It comes down to whether the computer has desires and feelings. We
>> can't be sure whether it does or not.
>
>
> Why would we even entertain the possibility that it does, though? If
> computers had feelings, wouldn't at least some of them complain about
> something or express some mood once in a while?

Does a water molecule? Does a protein?

>> We can't be sure whether a
>> bacterium has desires and feelings either. We are made of the same
>> stuff as the bacterium and we have desires and feelings, so something
>> that doesn't have desires and feelings can have desires and feelings
>> when it is arranged in a particular way. Whether it is deterministic
>> or random is, as you have said, orthogonal to this.
>
>
> I would give the benefit of the doubt that there is some degree of
> subjective content associated with bacteria on some level. The fact that
> they are arranged in a different way is an obvious difference between bacteria
> and brains, but that is not the only difference. A human body has a
> different history than a bacterium as well. Different things happen when a
> human zygote divides than when a bacterium divides. You assume the cause is
> the configuration and the effect is the difference in behaviors and
> capacities. I consider the possibility that configuration reflects a
> different experience and that the cause and effect are bi-directional. The
> effect of experience may not be passed on from one individual's body to
> another in a Lamarckian way, but that does not mean that there is not a
> conversation going on between two parallel aesthetics, one bottom-up
> unintentional and spatially local and one top-down intentional and
> temporally local (from a large now to a smaller now...i.e., when it is time
> for a particular shift, it begins to manifest in synchronous ways in
> multiple locations, like Newton and Leibniz).

Humans and bacteria are similar in some ways and different in others.
The differences could be as great as the differences between a modern
supercomputer and an AI of the future.

>> Our brains could be deterministic and we would still have the same
>> ideas about games, freedom of choice, moral responsibility and
>> everything else. You're unusual in finding it inconceivable.
>
>
> Why would we have any idea about 'choice' or 'freedom', or 'responsibility'?

Are you implying a general principle that if we can conceive of it, it
must be so?

> Why would those things be conceivable without any way to step back from
> determinism voluntarily? Do you think a typewriter thinks about choice or
> freedom? Does a machine gun think about responsibility?

Those machines don't, but neither does a water molecule or a protein.

>> > All games are created equal, but games which have real world
>> > consequences are not games. This of course maps to the simulation
>> > argument - where all simulations are interchangeable with each other,
>> > but none of them are interchangeable with the fundamental
>> > non-simulation. Digital fire can burn down a simulated house in the
>> > game, or a meta-simulated house within a game within a simulated house,
>> > but it can never burn down a real house outside of all of the games.
>> > Games are easy, reality is harder.
>>
>> Unless simulated beings can have experiences.
>
>
> Like Bugs Bunny. Maybe he really enjoys the taste of carrots?

Again, you use facile counterarguments, like a race of electronic
beings claiming humans can't be conscious because water molecules and
proteins obviously aren't.

>> You are begging the
>> question by assuming that they cannot. You are saying that you know, a
>> priori, that we are not living in a simulation now, but you have to
>> explain how you know this.
>
>
> I don't know it, but I understand why consciousness cannot be simulated by
> something which is not inherently conscious (because of the Presentation
> problem...hard problems, explanatory gap, binding problem, symbol grounding
> problem, mind-body symmetry problem)

These problems, such as they are, do not preclude consciousness from
being simulated. If we could explain why one type of thing could not
possibly be conscious, that would be a major step towards solving the
Hard Problem.

> and I understand why assembled bodies
> in space do not necessarily equal continuous experiences through time, and
> why, in general, maps are not territories. The only counter-argument I see
> is wishes, promises, and threats based on presumptions about consciousness
> defined from a 3p behaviorist perspective.

The map is not the same as the territory, but that does not preclude
the map from having properties of the territory.


-- 
Stathis Papaioannou
