On Wednesday, April 10, 2013 1:16:48 AM UTC-4, Brent wrote:
>
> On 4/9/2013 12:19 PM, Evgenii Rudnyi wrote: 
> > On 08.04.2013 11:38 Bruno Marchal said the following: 
> >> 
> >> On 07 Apr 2013, at 19:20, Evgenii Rudnyi wrote: 
> >> 
> >>> On 07.04.2013 19:12 meekerdb said the following: 
> >>>> On 4/6/2013 11:54 PM, Evgenii Rudnyi wrote: 
> >>>>> On 07.04.2013 02:40 Craig Weinberg said the following: 
> >>>>>> Ok, here's my modified version of Fig 11 
> >>>>>> 
> >>>>>> http://multisenserealism.files.wordpress.com/2012/01/33ost_diagram.jpg 
> >>>>>> 
> >>>>> 
> >>>>> I believe that you have misunderstood the paper. The authors 
> >>>>> literally believe that the observed 3D world is, geometrically 
> >>>>> speaking, in the brain. 
> >>>> 
> >>>> Yes, our 3D model of the world is in our minds (not our brains). 
> >>>> It's not "there" geometrically speaking. Geometry and "there" 
> >>>> are part of the model. Dog bites man. 
> >>> 
> >>> Well, if you look into the paper, you see that the authors take it 
> >>> literally, since in neuroscience mind means brain. Mind belongs to 
> >>> philosophy. 
> >> 
> >> 
> >> But mind is different from brain. And mind is part of both cognitive 
> >> science and theoretical computer science. To identify mind and brain 
> >> is possible in some strong non-computationalist theories, but such 
> >> theories don't yet exist, and are only speculated about. To confuse 
> >> mind and brain is like confusing literature and ink. Neurophilosophers 
> >> are usually computationalist and weakly materialist, and so are 
> >> basically inconsistent. 
> > 
> > I guess this is the way science develops. Neuroscientists study the 
> > brain, and they just take it a priori from the materialist and 
> > reductionist paradigm that the mind must be in the brain. 
>
> The materialist view is just that the mind is a process in the brain, like 
> a computation is the process of running a program in a computer. As 
> processes they may be abstracted from their physical instantiation and are 
> not anywhere, except maybe in Platonia. 
>
> > After that, they write papers to bring this idea to its logical 
> > conclusion. To this end, they seem to have two options. Either they 
> > should say that the 3D visual world is an illusion (I guess Dennett 
> > goes this way) 
>
> I think "illusion" has too strong a connotation of fallacious. I think 
> "model" is more 
> accurate. So long as we realize the world we conceptualize is a model then 
> we are not 
> guilty of a fallacy. 
>

Models have no presence. The same model can be expressed in any sense 
modality, so that our ability to conceptualize models is not the same 
phenomenon as our ability to perceive and participate in the world. 
Modeling is based on equivalence, and equivalence is part of pattern 
recognition, so that in order to even conceive of a model, there first 
would have to be direct perception and participation in a real world. 
Caring about the world gives you a reason to care about modeling it. If 
your world is invisible, intangible, and unconscious, then no models are 
needed and all participation is better served by automatic algorithms.
 

>
> > or put phenomenological consciousness into the brain. 
>
> I don't know what this means. That phenomenological consciousness depends 
> on the brain is empirically well established. 


That human consciousness is influenced by the brain is empirically well 
established. There is enough data from things like hydrocephalus, the 
recent psilocybin study, and NDEs to cast some doubt even on human-brain 
dependence in theory. Those exotic possibilities are not necessary, however, 
to see that there is a great variety of brainless species that nonetheless 
participate in the world in ways which seem more conscious than 
non-biological structures.

Think of it this way. If our brain produced phenomenal awareness, then the 
tissues of the brain would have to be responsible for that - the phenomenal 
consciousness of the brain would be dependent on the proto-phenomenal 
consciousness of neuronal sub-brains... otherwise consciousness appears out 
of nothing, for no particular reason, to live nowhere.

Craig
 

> But to "put it into" the brain implies making a spatial 
> placement of an abstract concept. 
>
>
>
> > Let us see what happens along this way. 
> > 
> > The paper in a way is well written. The only flaw (which is actually 
> > irrelevant to the content of the paper) that I have seen in it is THE 
> > ENTROPY. Biologists like entropy so much that they use it on any 
> > occasion. For example, from the paper: 
> > 
> > “Thus, changes in entropy provide an important window into 
> > self-organization: a sudden increase of entropy just before the 
> > emergence of a new structure, followed by brief period of negative 
> > entropy (or negentropy).” 
> > 
> > I have seen that this could be traced to Schrödinger’s What is Life?, 
> > so I reread his chapter on Order, Disorder and Entropy and made my 
> > comments: 
> > 
> > http://blog.rudnyi.ru/2013/04/schrodinger-disorder-and-entropy.html 
>
>
> Still tilting at that windmill? 
>
> "A) From thermodynamic tables, the mole entropy of silver at standard 
> conditions S(Ag, cr) 
> = 42.55 J K-1 mol-1 is bigger than that of aluminum S(Al, cr) = 28.30 J 
> K-1 mol-1. Does it 
> mean that there is more disorder in silver as in aluminium?" 
>
> Yes, there is more disorder in the sense that raising the temperature of 
> a mole of Ag by one degree increases the number of accessible conduction 
> electron states more than raising the temperature of a mole of Al does. 
>
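
As a rough numerical gloss on the Ag/Al comparison (my own sketch, not from 
the post): assuming the Boltzmann relation S = k*ln(W) and treating the 
states as roughly independent per atom, the tabulated molar entropies quoted 
above translate into a ratio of accessible microstates per atom.

import math

R = 8.314      # gas constant, J K^-1 mol^-1 (k_B per mole of atoms)
S_Ag = 42.55   # standard molar entropy of Ag(cr), J K^-1 mol^-1
S_Al = 28.30   # standard molar entropy of Al(cr), J K^-1 mol^-1

# Per atom, S/N_A = k_B*ln(w), so the microstate ratio per atom is
# exp((S_Ag - S_Al) / R).
ratio_per_atom = math.exp((S_Ag - S_Al) / R)
print(f"microstates per Ag atom vs per Al atom: ~{ratio_per_atom:.1f}x")
# ~5.6x: "more disorder" here just means more accessible states per atom.
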
> I agree that disorder is not necessarily a good metaphor for entropy. But 
> dispersal of energy isn't always intuitively equal to entropy either. 
> Consider dissolving ammonium nitrate in water. The process is endothermic, 
> so the temperature drops and energy is absorbed, but the process goes 
> spontaneously because the entropy increases; there are a lot more 
> microstates accessible in the solution even at the lower temperature. 
>
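
A quick numerical check of the ammonium nitrate example (the figures below 
are approximate textbook values, not given in the post): the dissolution is 
endothermic, yet spontaneous at room temperature because the entropy gain 
outweighs the enthalpy cost in the Gibbs free energy.

T = 298.15      # K, room temperature
dH = +25.7e3    # J/mol, approximate enthalpy of solution of NH4NO3 (heat absorbed)
dS = +108.0     # J/(mol K), approximate entropy of solution

dG = dH - T * dS                      # Gibbs free energy change
print(f"dG = {dG/1000:.1f} kJ/mol")   # roughly -6.5 kJ/mol: negative, so spontaneous
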
> Your quote of Arnheim makes me suspect that *he* is the one who has 
> confounded our language. Receiving information reduces uncertainty; it 
> doesn't necessarily increase order. Chaos and unpredictability do not 
> "carry a maximum of information". What they do is allow for a maximum 
> increase of information when they are resolved. Disorder doesn't provide 
> information - it provides the opportunity for using information, just as 
> ignorance of what a message will be is a measure of how much information 
> the message will contain when it removes the ignorance. 
>
> Brent 
>
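
The point about ignorance measuring how much information a message will 
carry is essentially Shannon's framing: the information received equals the 
prior uncertainty (the entropy of the source) that it removes. A minimal 
illustration (my own, not from the post above):

import math

def shannon_entropy(probs):
    """Expected information per message, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_coin   = [0.5, 0.5]     # maximally unpredictable source
loaded_coin = [0.99, 0.01]   # nearly certain source, little left to learn

print(shannon_entropy(fair_coin))    # 1.0 bit per outcome
print(shannon_entropy(loaded_coin))  # ~0.08 bits per outcome
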
