On 14 Oct 2013, at 17:46, Craig Weinberg wrote:

A first draft that I posted over the weekend.

I. Trailing Dovetail Argument (TDA)

A. Computationalism makes two ontological assumptions which have not been properly challenged:

1. The universality of recursive cardinality.
2. Complexity-driven novelty.
Both of these, I intend to show, are intrinsically related to consciousness in a non-obvious way.

B. Universal Recursive Cardinality

Mathematics, I suggest, is defined by the assumption of universal cardinality: the universe is reducible to a multiplicity of discretely quantifiable units.


No mathematician will read any further after this. Sorry Craig, but if you use standard terms, you need to follow the standard use. Mathematicians avoid talking about the universe at the start, and they no longer try to build foundations, still less on anything discrete.

Please avoid jargon, and use simpler terms. Avoid attributing nonsense to others. You make unclear what generations of thinkers have succeeded in making clear.




The origin of cardinality, I suggest, is the partitioning or multiplication of a single, original unit, so that every subsequent unit is a recursive copy of the original.

Because recursiveness is assumed to be fundamental throughout math, the idea of a new ‘one’ is impossible. Every instance of one is a recurrence of the identical and self-same ‘one’, or an inevitable permutation derived from it. By overlooking the possibility of absolute uniqueness, computationalism must conceive of all events as local reproductions of stereotypes from a Platonic template rather than ‘true originals’.

A ‘true original’ is that which has no possible precedent. The number one would be a true original, but then all other integers represent multiple copies of one. All rational numbers represent partial copies of one. All prime numbers are still divisible by one, so not truly “prime”, but pseudo-prime in comparison to one. One, by contrast, is prime relative to mathematics, but no number can be a true original, since it is divisible and repeatable and therefore non-unique. A true original must be indivisible and unrepeatable, like an experience, or a person. Even an experience which is part of a highly repetitive experiential chain is, on some level, unique in the history of the universe, unlike a mathematical expression such as 5 x 4 = 20, which is never any different from 5 x 4 = 20, regardless of the context.
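
The 5 x 4 = 20 point is, in computational terms, referential transparency: a pure expression has the same value in every context, while an occurrence of experience is indexed by an unrepeatable moment. A minimal sketch in Python (the function names are mine, purely illustrative):

    import time

    def pure():
        # referentially transparent: the value never depends on context
        return 5 * 4

    def occasioned():
        # the same arithmetic, but stamped by the moment at which it occurs
        return (5 * 4, time.perf_counter_ns())

    assert pure() == pure()    # every repetition is strictly identical
    print(occasioned())        # each call carries a stamp that will
    print(occasioned())        # never recur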

I think that when we assert a universe of recursive recombinations that know no true originality, we should not disregard the fact that this strongly contradicts our intuitions about the proprietary nature of identity. A generic universe would seem to counterfactually predict a very low interest in qualities such as individuality and originality, and in identification with trivial personal preferences. Of course, what we see is the precise opposite: all celebrity is propelled by some suggestion of unrepeatability, and the fine-tuning of lifestyle choices is arguably the most prolific and successful feature of consumerism.

If the experienced universe were strictly an outcropping of a machine that by definition can create only trivially ‘new’ combinations of copies, why would quantitatively recombined differences, such as that between 456098209093457976534 and 45609420909345797353, seem insignificant to us, while the difference between a belt worn by Elvis and a copy of that belt is demonstrably significant to many people?

C. Complexity-Driven Novelty

Computationalism assumes finite simplicity; that is, it provides only a pseudo-uniqueness by virtue of the relatively low statistical probability of large numbers overlapping each other precisely.


?



There is no irreducible originality to the original Mona Lisa; only the vastness of the physical painting’s microstructure prevents it from being exactly reproduced very easily. Such a perfect reproduction, under computationalism, is indistinguishable from the original, and therefore neither can be more original than the other (or, if there are unavoidable differences due to uncertainty and incompleteness, they would be noise differences of no consequence).
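
The “vastness of the microstructure” point can be made quantitative: if an artifact’s microstructure is modeled as a uniformly random n-bit string, the chance that two independent samples coincide is 2^(-n), astronomically small without being zero. A rough sketch in Python, with the bit count an arbitrary assumption for illustration:

    import math

    n_bits = 10**6                      # assumed microstructure size, in bits
    # 2**-n_bits underflows a float, so work with the base-10 logarithm:
    log10_p = -n_bits * math.log10(2)
    print(f"collision probability ~ 10^{log10_p:.0f}")   # ~ 10^-301030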

This is where information theory departs from realism, since reality provides memories and evidence of which Mona Lisa is new and which one was painted by Leonardo da Vinci at the beginning of the 16th century in Florence, Italy, Earth, Sol, Milky Way Galaxy*.

Mathematics can be said to allow for the possibility of novelty in only one direction: that of higher complexity. New qualities, by computationalism, must arise on the event horizons of something like the Universal Dovetailer. If that is the case, it seems odd that the language of qualia is one of rich simplicity rather than cumbersome computables. With comp, there can be no new ‘one’, but in reality, every human experience is exactly that – a new day, a new experience, even if it often seems much like the one before. Numbers don’t work that way. Each mechanical result is identical. A = A. A does not ‘seem much like the A before, yet in a new way‘. This is a huge problem with mathematics and theoretical physics: they don’t get the connection between novelty and simplicity, so they hope to find it in the vastness of super-human complexity.

II. Computation as Puppetry

I think that even David Chalmers, whom I respect immensely for his contributions to philosophy of mind and in communicating the Hard Problem, missed a subtle but important distinction. The difference between a puppet and a zombie, while superficially innocuous, has profound implications for the formulation of a realistic critique of Strong AI. When Chalmers introduced or popularized the term zombie in reference to hypothetical perfect human duplicates which lack qualia and subjective experience, he inadvertently let an unscientific assumption leak in.

A zombie is supernatural because it implies the presence of an absence. It is an animated, un-dead cadaver in which a living person is no longer present. The unconsciousness of a puppet, however, is merely tautological – it is the natural absence of presence of consciousness which is the case with any symbolic representation of a character, such as a doll, cartoon, or emoticon. A symbolic representation, such as Bugs Bunny, can be mass produced using any suitable material substance or communication media. Even though Bugs is treated as a unique intellectual property, in reality, the title to that property is not unique and can be transferred, sold, shared, etc.

The reason that Intellectual Property law is such a problem is that anyone can take some ordinary piece of junk, put a Bugs Bunny picture on it, and sell more of it than they would have otherwise. Bugs can’t object to having his good name sullied by hack counterfeiters, so the image of Bugs Bunny is used both to falsely endorse an inferior product and to falsely impugn the reputation of a brand. The problem is, any reasonable facsimile of Bugs Bunny is just as authentic, in an Absolute sense, as any other. The only true original Bugs Bunny is the one we experience through our imagination and the imagination of Mel Blanc and the Looney Tunes animators.

The impulse to reify the legitimacy of intellectual property into law is related to the impulse to project agency and awareness onto machines. As a branch of the “pathetic fallacy”, which takes literally those human qualities which have been applied to non-humans as figurative conveniences of language, the computationalistic fallacy projects an assumed character-hood onto the machine as a whole. It reasons (falsely, I think) that since all that our body can see of ourselves is a body, it is the body which is the original object from which the subject is produced through its functions. Such a conclusion, when we begin from mechanism, seems unavoidable at first.

III. Hypothesis

I propose that we reverse the two assumptions of mathematics above, so that

1. Recursion is assumed to be derived from primordial spontaneity rather than the other way around.
2. Novelty can only be meaningful if it re-asserts simplicity in addition to complexity.

This would mean:

1. The expanding event horizon of the Universal Dovetailer would have to be composed of recordings of sensed experiences after the fact, rather than precursors to subjective simulation of the computation.
2. Comp is untrue by virtue of diagonalization of immeasurable novelty against incompleteness. Sense out-incompletes arithmetic truth, and therefore leaves it frozen in stasis by comparison, in every instant and in eternity.
3. Computation cannot animate anything except through susceptibility to the pathetic fallacy.

This may seem unfair or insulting to the many great minds who have been pioneering AI theory and development, but that is not my intent. By assertively pointing out the need to move from a model of consciousness which hinges on simulated spontaneity to a model in which spontaneity can never, by definition, be simulated, I am trying to express the importance and urgency of this shift. If I am right, the future of human understanding depends ultimately on our ability to graduate from the cul-de-sac of mechanistic supremacy to the more profound truth of rehabilitated animism. Feeling does compute because computation is how the masking of feeling into a localized unfeeling becomes possible.


IV. Reversing the Dovetailer

By uncovering the intrinsic antagonism between the above mathematical assumptions and the authentic nature of consciousness, it might be possible to ascertain a truer model of consciousness by reversing the order of the Universal Dovetailer (machine that builds the multiverse out of programs).
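
Since the argument turns on what a dovetailer does, a minimal sketch of dovetailing itself may help: the UD interleaves the execution of every program so that no single program blocks the rest. The sketch below assumes programs are modeled as Python generators, a stand-in for an enumeration of all machines; every name is illustrative, not code from Bruno's UD.

    import itertools

    def dovetail(programs, rounds=6):
        # In round n, admit the n-th program and run one step of every
        # program admitted so far. Every program, halting or not, gets
        # unboundedly many steps.
        started = []
        progs = iter(programs)
        for _ in range(rounds):
            p = next(progs, None)          # admit the next program, if any
            if p is not None:
                started.append(p)
            for prog in list(started):
                try:
                    print(next(prog))      # one computation step
                except StopIteration:
                    started.remove(prog)   # this program halted; retire it

    def toy_program(pid):
        for step in itertools.count():
            yield (pid, step)              # an endless "computation"

    dovetail(toy_program(i) for i in itertools.count())

The two reversals below target exactly this bottom-up, step-interleaved mode of generation.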

1. The universality of recursive cardinality reverses as the Diagonalization of the Unique.
2. Complexity-driven novelty can be reversed by Pushing the UD.

A. Diagonalization of the Unique

Under the hypothesis that computation lags behind experience*, no simulation of a brain can ever catch up to what a natural person can feel through that brain, since the natural person is constantly consuming the uniqueness of their experience before it can be measured by anything else. Since the uniqueness of subjectivity is immeasurable and unprecedented within its own inertial frame, no instrument from outside of that frame can capture it before it decoheres into cascades of increasingly generic public reflections.

PIP flips the presumption of Universal Recursive Cardinality inherent in mathematics so that all novelty exists as truly original simplicity, as well as a relatively new complex recombination, such that the continuum of novelty extends in both directions. This, if properly understood, should be a lightning bolt that recontextualizes the whole of mathematics. It is like discovering a new kind of negative number. Things like color and human feeling may exploit the addressing scheme that complex computation offers, but the important part of color or feeling is not in that address, but in the hyper-simplicity and absolute novelty that ‘now’ corresponds to that address. The incardinality of sense means that all feelings are more primitive than even the number one or the concept of singularity. They are rooted in the eternal ‘becoming of one’; before and after cardinality. Under PIP, computation is a public repetition of what is irreducibly unrepeatable and private. Computation can never get ahead of experience, because computation is an a posteriori measurement of it.

For example, a computer model of what an athlete will do on the field that is based on their past performance will always fail to account for the possibility that the next performance will be the first time that athlete does something that they never have done before and that they could not have done before. Natural identities (not characters, puppets, etc.) are not only self-diagonalizing; natural identity itself is self-diagonalization. We are that which has not yet experienced the totality of its lifetime, and that incompleteness infuses our entire experience. The emergence of the unique always cheats prediction, since all prediction belongs to the measurements of an expired world which did not yet contain the next novelty.
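
The athlete example has the shape of a classical diagonal argument: given any enumeration of total predictors, one can define a behavior that differs from every predictor at its own index. A minimal sketch, with the predictor list an obviously toy assumption:

    # Any listed family of total predictors f_0, f_1, ... is out-run by the
    # diagonal behavior g(n) = f_n(n) + 1, which disagrees with each f_n at
    # the argument n.
    predictors = [
        lambda n: 0,        # "the athlete never improves"
        lambda n: n,        # "performance grows linearly"
        lambda n: n * n,    # "performance grows quadratically"
    ]

    def diagonal(n):
        return predictors[n](n) + 1

    for i, f in enumerate(predictors):
        assert diagonal(i) != f(i)   # escapes every prediction in the list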

B. Pushing the UD

If the UD is a program which pulls the experienced universe behind it as it extends the computed realm, faster than light, ahead of local appearances, then it assumes all phenomena are built bottom-up from generic, interchangeable bits. The hypothesis under PIP is that if there were a UD, it would be pushed by experience from the top down, as well as recollecting fragments of previous experiences from the bottom up. Each experience decays from immeasurable private qualia that are unique into public reflections that are generic recombinations of fixed elements. Reversing the Dovetailer puts universality on the defensive, so that it becomes a storage device rather than a pseudo-primitive machina ex deus.

The primacy of sense is corroborated by the intuition that every measure requires a ruler: some exemplar which is presented as an index for comparison. The uniqueness comes first, and the computability follows by imitation. The un-numbered Great War becomes World War I only in retrospect. The second war does not follow the rule of world wars; it creates the rule by virtue of its similarities. The second war is unprecedented in its own right, as an original second world war, but unlike the number two, it is not literally another World War I. In short, experiences do not follow from rules; rules follow from experience.

V. Conclusions

If we extrapolate the assumptions of Computationalism out, I think that they would predict that the painting of the Mona Lisa is what always happens under the mathematical conditions posed by a combination of celestial motions, cells, bodies, brains, etc. There can be no truly original artwork, as all artworks are inevitable under some computable probability, even if the particular work is not predictable specifically by computation. Comp makes all originals derivatives of duplication. I suggest that it makes more sense that the primordial identity of sense experience is a fundamental originality from which duplication is derived. The number one is a generic copy – a one-ness which comments on an aspect of what is ultimately boundaryless inclusion rather than naming originality itself.

Under Multisense Realism (MSR), the sense-first view ultimately makes the most sense, but it allows that the counter-perspective, in which sense follows computation or physics, would appear to be true in another way, one which yields meaningful insights that could not be accessed otherwise.

When we shift our attention from the figure of comp in the background of sense to the figure of sense in the background of comp, the relation of originality shifts also. With sense first, true originality makes all computations into imposters. With computation first, arithmetic truth makes local appearances of originality artifacts of machine self-reference. Both are trivially true, but if the comp-first view were Absolutely true, there would be no plausible justification for treating appearances of originality as qualitatively significant. A copy and an original should have no greater difference than a fifteenth copy and a sixteenth copy, and being the first person to discover America should have no more import than being the 1,588,237th person to discover America. The title of this post, as 2013/10/13/2562, would be as good a title as any other referenceable string.

*This is not to suggest that human experience lags behind neurological computation. MSR proposes a model called eigenmorphism to clarify the personal/sub-personal distinction, in which neurological-level computation corresponds to sub-personal experience rather than personal-level experience. This explains the disappearance of free will in neuroscientific experiments such as Libet et al. Human personhood is simple but deep. Simultaneity is relative, and nowhere is that more true than along the continuum between the microphysical and the macrophenomenal. What can be experimented on publicly is, under MSR, a combination of the near-isomorphic and the near-contra-isomorphic to private experience.

To sum up: unintelligible prose => non comp.

I can make sense of some images and assertions here and there, but the overall message does not go through, and as the claim is negative (those machines are puppets), I find this a bit annoying, like I find racist and prohibitionist overgeneralizations annoying, especially when repeated.

Note that you are correct on something, but it is only comp which helps me to realize this, making your negative allegation almost blasphemous on some spiritual level.

No machine/person can know, or even believe, being any particular machine, but that is not a logical obstacle to understanding, counterintuitively for sure, why it has to be like that, for all machines. Somehow, you also make the confusion between []p and []p & p, all the time. Study comp and a bit of computer science, and dig deeper, before dismissing possible others.
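
For readers new to the notation: []p versus []p & p is the standard provability-logic distinction, with []p read as "the machine proves p". A brief gloss in LaTeX (the substance is standard material, cf. Boolos, The Logic of Provability; the phrasing is a gloss, not Bruno's words):

    \begin{align*}
    \text{provable (believable):}\quad & \Box p
        && \text{logic GL: } \Box p \rightarrow p \text{ is not a theorem}\\
    \text{known (Theaetetus):}\quad & \Box p \land p
        && \text{logic S4Grz: } (\Box p \land p) \rightarrow p \text{ holds trivially}
    \end{align*}

For a sound machine both hold of exactly the same arithmetic sentences, yet the machine cannot prove that equivalence about itself, which is why conflating the two changes which logic governs the argument.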

Bruno


http://iridia.ulb.ac.be/~marchal/


