On Thu, 5 Apr 2018 at 2:58 am, smitra <smi...@zonnet.nl> wrote:

> On 02-04-2018 17:27, Bruno Marchal wrote:
> >> On 1 Apr 2018, at 00:29, Lawrence Crowell
> >> <goldenfieldquaterni...@gmail.com> wrote:
> >>
> >> On Saturday, March 31, 2018 at 2:32:06 PM UTC-6, telmo_menezes
> >> wrote:
> >>
> >>> On Sat, Mar 31, 2018 at 10:17 PM, Lawrence Crowell
> >>> <goldenfield...@gmail.com> wrote:
> >>>> You would have to replicate not only the dynamics of neurons, but
> >>>> every biomolecule in the neurons, and don't forget about the
> >>>> oligoastrocytes and other glial cells. Many enzymes, for instance,
> >>>> are multi-state systems; in a simple case a single amino acid
> >>>> residue is phosphorylated or unphosphorylated, so that in effect
> >>>> such enzymes are binary switching units. To make this work you
> >>>> would need to have the brain states mapped out down to the
> >>>> molecular level, and further to have their combinatorial
> >>>> relationships mapped. Biomolecules also behave in water, so you
> >>>> have to model all the water molecules. Given that the brain has
> >>>> around 10^{25} molecules (a few moles), the number of possible
> >>>> combinations might be on the order of 10^{10^{25}}; this is a
> >>>> daunting task. Your computer also has to accurately encode the
> >>>> dynamics of molecules -- down to the quantum mechanics of their
> >>>> bonds.
> >>>>
> >>>> This is another way of saying that biological systems, even that
> >>>> of a basic prokaryote, are beyond our current abilities to
> >>>> simulate. You can't just hand-wave away the enormous problems with
> >>>> simulating a bacillus, let alone something like the brain. Of
> >>>> course one can do some simulations to learn about the brain in a
> >>>> model system, but this is far from mapping a brain and its
> >>>> conscious state into a computer.
> >>>
> >>> Well maybe, but this is just you guessing.
> >>> Nobody knows the necessary level of detail.
> >>>
> >>> Telmo.
> >>
> >> Take LSD or psilocybin mushrooms and what enters the brain are
> >> chemical compounds that interact with neural ligand-gated receptors.
> >> The effect is a change in conscious perception. So if we load
> >> coarse-grained brain states into a computer and ignore lots of
> >> fine-grained detail, will that result in something different?
> >> Hell yeah! The idea that one could set up a computer neural network,
> >> upload some data file from a brain scan, and that this would be a
> >> completely conscious person is frankly absurd.
> >
> > This means that you bet on a lower substitution level. I guess others
> > have already answered this. Note that the proof that physics is a
> > branch of arithmetic does not put any bound on the graining of the
> > substitution level. It could even be that your brain is the entire
> > universe described at the level of superstring theory; that would
> > change nothing in the conclusion of the reasoning. Yet it would be a
> > threat for evolution and biology as conceived today.
> >
> > Bruno
> >
> >> LC
> >>
> In experiments involving stimulation/inhibition of certain brain parts
> using strong magnetic fields, where people look for a few seconds at a
> screen with a large number of dots, it was found that significantly more
> people can correctly guess the number of dots when the field was
> switched on. The conclusion was that under normal circumstances, when we
> are not aware of lower-level information such as the exact number of
> dots on the screen, that information is actually present in the brain
> but we're not consciously aware of it. Certain people who have "savant
> syndrome" can be constantly aware of such lower-level information.
> This then suggests to me that the substitution level can be taken at a
> much higher level than the level of neurons. In the MWI we would have to
> imagine ourselves being spread out over sectors where information such
> as the number of dots on a screen is different. So, what you're not
> aware of isn't fixed for you, and therefore it cannot possibly define
> your identity.

Different physical states may lead to the same mental state until some
differentiating physical event occurs, and then the mental states diverge.
For example, the biological and the silicon version may have identical
experiences until they are exposed to a drug or to physical trauma. If, for
some reason, you were unhappy with this difference, you could insist that
your brain replacement incorporate further refinements so that it behaves
more like the original.
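As an aside, the combinatorial estimate in the quoted post can be sanity-checked in a few lines of Python. Treating each of the ~10^25 molecules as an idealised binary switch (the phosphorylated/unphosphorylated picture) gives 2^(10^25) configurations; working in logarithms shows the exponent itself is ~3×10^24, so the quoted 10^{10^{25}} is right as a rough order of magnitude of the exponent:

```python
import math

# Assumption (from the quoted post): ~10^25 molecules, each idealised
# as a binary switch (e.g. a residue phosphorylated or not).
n_molecules = 1e25

# The number of joint configurations is 2^N -- far too large to hold
# as an integer, so work with its base-10 logarithm instead.
log10_configs = n_molecules * math.log10(2)

print(f"2^(10^25) = 10^{log10_configs:.3e}")  # exponent is ~3.01e24
```

So the state count is about 10^(3×10^24), within a small factor of 10^{10^{25}} in the exponent, which is all the original back-of-the-envelope argument needs.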

--
Stathis Papaioannou

You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to everything-list@googlegroups.com.
Visit this group at https://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.
