By that analogy, you are crediting the letter "e" with authoring Hamlet. The cart goes behind the horse.

>> Neurons deal in GABA and acetylcholine. I deal in paychecks and days
>> off. Different levels of description.

> That is true, they are different, but they are both correct descriptions,
> although the scales they deal with are quite different, and one is built
> on top of the other; you can have GABA and acetylcholine without paychecks
> and days off, but unless you move to electronics you can't have paychecks
> and days off without GABA and acetylcholine.

Even more reason that the dumb neurotransmitters need high-level teleology to inform them. You can have bricks without having the Taj Mahal, but you can't have the Taj Mahal without bricks. That doesn't mean that bricks, or even bricklayers, are responsible for the Taj Mahal. Top-down personal motives can direct lower, sub-personal activities.

>> My neurons can influence my consciousness from a sub-personal level -
>> say, feeling unfulfilled when I get my paycheck - but they cannot decide
>> for me to get a better job.

> One neuron can't do that, but 10^11 neurons with their 10^14 synapses can.

Not unless they can stop being a group of neurons and start being a person in the world.

>> I decide. Me.

> Correct, and "I" is a high-level description of what 10^11 neurons with
> 10^14 synapses do

Why would the high-level description be any different from the low-level description?

> , just as "pressure" is a correct high-level description of how 6.02*10^23
> molecules behave when they collide with the inner surface of a closed
> chamber.

There is no unexpected gap between the behavior of those two levels of description. With subjectivity, the gap is infinite. There is no overlap at all between the color blue, or the feeling of dizziness, and any kind of functional configuration of computational outcomes.

>> My social experience

> Encoded by neurons as memories.
They could not apply in an encoded state. What you are saying suggests a subjectivity where, every time I almost put my hand on a hot stove, a specific memory is called up and decoded (decoded into what?). I don't think it works that way; although specific memories can be evoked, they are not necessary for moral conscience to condition us.

>> and an innate sensitivity

> Determined by the collective state of all those neurons and synapses
> inside that bone box sitting on your shoulders.

There is nothing to collect the state other than me. I am the collector.

>> circumscribes these acts as criminal, evil, or both,

> OK.

>> I can meet someone and go into business with them. I can have an idea
>> and make money from it. I can get run over by a furniture truck and
>> collect an insurance settlement.

> Yes, but I already knew that; I already knew that sometimes collections of
> neurons decide to behave in that way.

Only because you are a person and are projecting personal experience onto neurons. There is nothing we have seen in neurons to suggest that they know how to behave in any particular way beyond the expected cellular processes.

>> NONE of these possibilities are realizable on the sub-personal or
>> super-signifying levels.

> But if I change the chemistry of some of your neurons, you will decide not
> to go into business with anyone because you will believe you will not make
> money from it, and you will also figure that it would not be a good idea
> to file an insurance claim.

Absolutely. It goes both ways. We can control our sub-personal actions, but impersonal conditions can control us personally. It isn't a simple relationship, because the impersonal has a different kind of power over the personal than the personal has over the impersonal. A single bullet can kill a leader, but a single idea can keep changing the behavior of people all over the world forever. Both illustrate how the universe makes sense.
>> Let's compare. Does your computer worry about its job?

> I don't know for sure; I don't even know if you worry about your job,
> because all I can observe is behavior. I do have a theory that
> extrapolates consciousness and emotion from behavior, and I think it's a
> pretty good theory, but it's not proven and never will be, so I just do
> the best I can.

If you had to bet, though? If it really mattered, and there was a right answer, and you had to pick yes or no on whether your computer worries about its job, could you honestly say that it is likely?

>> Does it get a feeling one way or another if it receives more or less
>> volts?

> I don't know for sure; I don't even know if you get a feeling one way or
> another if you receive an electric shock, because all I can observe is
> behavior. I do have a theory that extrapolates consciousness and emotion
> from behavior, and I think it's a pretty good theory, but it's not proven
> and never will be, so I just do the best I can.

I understand that, and it isn't wrong in a sense. What I see, though, is that voyeuristic formalism may have to be transcended in order for us to progress to the next level of scientific insight. If we are to understand consciousness, we have to do better than waiting to be told that computers aren't alive. We are going to have to understand intuition, sense, and states of awareness that go beyond logic.

Craig

> John K Clark

--
You received this message because you are subscribed to the Google Groups "Everything List" group.
To view this discussion on the web visit https://groups.google.com/d/msg/everything-list/-/Y3_Lc4GapzcJ.