Re: Trailing Dovetailer Argument

2013-10-14 Thread Craig Weinberg


On Monday, October 14, 2013 6:18:17 PM UTC-4, Liz R wrote:
>
> On 15 October 2013 08:03, Craig Weinberg wrote:
>
>>
>> On Monday, October 14, 2013 12:46:34 PM UTC-4, Bruno Marchal wrote:
>>>
>>>
>>> On 14 Oct 2013, at 17:46, Craig Weinberg wrote:
>>>
>>> A first draft that I posted over the weekend.
>>>
>>> *I. Trailing Dovetail Argument (TDA)*
>>>
>>> *A. Computationalism makes two ontological assumptions which have not 
>>> been properly challenged:*
>>>
>>>- *The universality of recursive cardinality*
>>>- *Complexity driven novelty*.
>>>
>>> Both of these, I intend to show, are intrinsically related to 
>>> consciousness in a non-obvious way.
>>>
>>> *B. Universal Recursive Cardinality*
>>>
>>> Mathematics, I suggest, is defined by the assumption of universal 
>>> cardinality: The universe is reducible to a multiplicity of discretely 
>>> quantifiable units.
>>>
>>> No mathematician will  read more after this. Sorry Craig, but if you use 
>>> standard terms, you need to follow the standard use. 
>>>
>>
>> I'm not trying to use standard terms, I'm trying to use precise terms. My 
>> intention is to give them a new and unprecedented use, just as anyone 
>> breaking new ground uses language in a new way.
>>
>
> They laughed at Copernicus ... they laughed at Stephenson ... they laughed 
> at Edison ...
>
> Seriously though, it might be a good idea to invent new terms if you need 
> them, rather than using existing ones in a new and potentially confusing 
> way.
>
> (Mind you I get confused with the existing ones used in the normal way, 
> but that's another story.)
>

Thanks, I appreciate that, but when I invent new terms, people complain 
that I am making up words. I don't think there is any way to introduce 
radically new concepts without irritating those who identify personally 
with their mastery of the old concepts.

Since I'm commenting on computationalism, it should be fairly clear that I 
am using terms analytically rather than narrowly. I'm pointing out not what 
computationalism explicitly says, but what its implicit assumptions are.

Craig



Re: Trailing Dovetailer Argument

2013-10-14 Thread LizR
On 15 October 2013 08:03, Craig Weinberg  wrote:

>
> On Monday, October 14, 2013 12:46:34 PM UTC-4, Bruno Marchal wrote:
>>
>>
>> On 14 Oct 2013, at 17:46, Craig Weinberg wrote:
>>
>> A first draft that I posted over the weekend.
>>
>> *I. Trailing Dovetail Argument (TDA)*
>>
>> *A. Computationalism makes two ontological assumptions which have not
>> been properly challenged:*
>>
>>- *The universality of recursive cardinality*
>>- *Complexity driven novelty*.
>>
>> Both of these, I intend to show, are intrinsically related to
>> consciousness in a non-obvious way.
>>
>> *B. Universal Recursive Cardinality*
>>
>> Mathematics, I suggest, is defined by the assumption of universal
>> cardinality: The universe is reducible to a multiplicity of discretely
>> quantifiable units.
>>
>> No mathematician will  read more after this. Sorry Craig, but if you use
>> standard terms, you need to follow the standard use.
>>
>
> I'm not trying to use standard terms, I'm trying to use precise terms. My
> intention is to give them a new and unprecedented use, just as anyone
> breaking new ground uses language in a new way.
>

They laughed at Copernicus ... they laughed at Stephenson ... they laughed
at Edison ...

Seriously though, it might be a good idea to invent new terms if you need
them, rather than using existing ones in a new and potentially confusing
way.

(Mind you I get confused with the existing ones used in the normal way, but
that's another story.)



Re: Trailing Dovetailer Argument

2013-10-14 Thread Craig Weinberg


On Monday, October 14, 2013 12:46:34 PM UTC-4, Bruno Marchal wrote:
>
>
> On 14 Oct 2013, at 17:46, Craig Weinberg wrote:
>
> A first draft that I posted over the weekend.
>
> *I. Trailing Dovetail Argument (TDA)*
>
> *A. Computationalism makes two ontological assumptions which have not 
> been properly challenged:*
>
>- *The universality of recursive cardinality*
>- *Complexity driven novelty*.
>
> Both of these, I intend to show, are intrinsically related to 
> consciousness in a non-obvious way.
>
> *B. Universal Recursive Cardinality*
>
> Mathematics, I suggest, is defined by the assumption of universal 
> cardinality: The universe is reducible to a multiplicity of discretely 
> quantifiable units. 
>
>
> No mathematician will  read more after this. Sorry Craig, but if you use 
> standard terms, you need to follow the standard use. 
>

I'm not trying to use standard terms, I'm trying to use precise terms. My 
intention is to give them a new and unprecedented use, just as anyone 
breaking new ground uses language in a new way.
 

> Mathematicians avoid talking about the universe at the start, and they no 
> longer try to build foundations, still less on anything discrete. 
>

It makes sense that mathematicians would avoid talking about the universe, 
but that might be because they are afraid of exposing the limitations of 
the purely mathematical approach. I'm not afraid though.


> Please avoid jargon, and use simpler terms. Avoid attributing nonsense to 
> others. You make unclear what generations of thinkers have succeeded in 
> making clear. 
>

Something can be made to seem clear by only talking to people who think the 
same way. Ptolemaic astronomy was clear, until Galileo and Copernicus made 
it unclear.


>
>
>
> The origin of cardinality, I suggest, is the partitioning or 
> multiplication of a single, original unit, so that every subsequent unit is 
> a recursive copy of the original.
>
> Because recursiveness is assumed to be fundamental through math, the idea 
> of a new ‘one’ is impossible. Every instance of one is a recurrence of the 
> identical and self-same ‘one’, or an inevitable permutation derived from 
> it. By overlooking the possibility of absolute uniqueness, computationalism 
> must conceive of all events as local reproductions of stereotypes from a 
> Platonic template rather than ‘true originals’.
>
> A ‘true original’ is that which has no possible precedent. The number one 
> would be a true original, but then all other integers represent multiple 
> copies of one. All rational numbers represent partial copies of one. All 
> prime numbers are still divisible by one, so not truly “prime”, but 
> pseudo-prime in comparison to one. One, by contrast, is prime, relative to 
> mathematics, but no number can be a true original since it is divisible and 
> repeatable and therefore non-unique. A true original must be indivisible 
> and unrepeatable, like an experience, or a person. Even an experience which 
> is part of an experiential chain that is highly repetitive is, on some 
> level unique in the history of the universe, unlike a mathematical 
> expression such as 5 x 4 = 20, which is never any different than 5 x 4 = 
> 20, regardless of the context.
>
> I think that when we assert a universe of recursive recombinations that 
> know no true originality, we should not disregard the fact that this 
> strongly contradicts our intuitions about the proprietary nature of 
> identity.  A generic universe would seem to counterfactually predict a very 
> low interest in qualities such as individuality and originality, and 
> identification with trivial personal preferences. Of course, what we see 
> is the precise opposite: all celebrity is propelled by some suggestion of 
> unrepeatability, and the fine-tuning of lifestyle choices is arguably the 
> most prolific and successful feature of consumerism.
>
> If the experienced universe were strictly an outcropping of a machine that 
> by definition can create only trivially ‘new’ combinations of copies, why 
> would those kinds of quantitatively recombined differences such as that 
> between 456098209093457976534 and 45609420909345797353 seem insignificant 
> to us, but the difference between a belt worn by Elvis and a copy of that 
> belt seem demonstrably significant to many people?
>
> *C. Complexity Driven Novelty*
>
> Because computationalism assumes *finite* simplicity,  that is, it 
> provides only a pseudo-uniqueness by virtue of the relatively low 
> statistical probability of large numbers overlapping each other precisely. 
>
>
> ?
>

Meaning that it can only say that an identical twin is unlikely, not that 
it is impossible because individuality is inherently unique.
 

>
>
>
> There is no irreducible originality to the original Mona Lisa, only the 
> vastness of the physical painting’s microstructure prevents it from being 
> exactly reproduced very easily.  Such a perfect reproduction, under 
> computationalism is indis

Re: Trailing Dovetailer Argument

2013-10-14 Thread Craig Weinberg


On Monday, October 14, 2013 12:13:42 PM UTC-4, JohnM wrote:
>
> Craig: beautiful. I saved it for my closer understanding (if...). 
> One little intrusion though:
>
> *you write: the first copy of something should not be different from the 
> 15,347,498th copy (figure arbitrary)*.
>  My 'agnosticism' objects:
> The first copy is restricted to the techniques applicable for copying, not 
> necessarily including the 'totality' of the original (infinite complexity). 
> The later copies copy the first one. 
> Meaning: we CANNOT copy in toto, only within our human capabilities. 
> (I extend such restriction to *'analytical'* - restricted to KNOWN parts, 
> to *'statistical'* dependent on the border-limits and the qualia we 
> include in identifying the counted items, to *'probability' *and some 
> more.)
>

Thanks John,

Right. If I get what you are saying, I should have written 'the second copy 
of something should not be different...', since it is in the first instance 
of copying that most of the original is amputated. Once we record a musical 
performance, for example, everything that can be lost in recording has 
already been lost on the first go, so that each digital copy will be as 
good as the first. The recording process used to make the first 'copy' is 
the one that is different from all copies of that master recording.
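
A rough sketch of what I mean, in Python, with a made-up 'analog' signal 
and a deliberately crude rounding step standing in for the recording 
process (the numbers and the quantization are invented for illustration):

    # Toy illustration: the loss happens once, at capture; after that,
    # every digital copy is indistinguishable from the master.
    import hashlib

    analog_performance = [0.12, 3.14159, 2.71828, 0.007]  # stand-in for the live event
    master = [round(x, 2) for x in analog_performance]    # lossy first capture

    copy_1 = list(master)                                  # digital copies of the master
    copy_2 = list(copy_1)

    def digest(xs):
        return hashlib.sha256(repr(xs).encode()).hexdigest()

    print(master == analog_performance)       # False: detail lost at capture
    print(digest(copy_1) == digest(master))    # True: the copy equals the master
    print(digest(copy_2) == digest(master))    # True: so does the copy of the copy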

Craig
 

>
> John Mikes
>
>
>
> On Mon, Oct 14, 2013 at 11:46 AM, Craig Weinberg wrote:
>
>> A first draft that I posted over the weekend.
>>
>> *I. Trailing Dovetail Argument (TDA)*
>>
>> *A. Computationalism makes two ontological assumptions which have not 
>> been properly challenged:*
>>
>>- *The universality of recursive cardinality*
>>- *Complexity driven novelty*.
>>
>> Both of these, I intend to show, are intrinsically related to 
>> consciousness in a non-obvious way.
>>
>> *B. Universal Recursive Cardinality*
>>
>> Mathematics, I suggest, is defined by the assumption of universal 
>> cardinality: The universe is reducible to a multiplicity of discretely 
>> quantifiable units. The origin of cardinality, I suggest, is the 
>> partitioning or multiplication of a single, original unit, so that every 
>> subsequent unit is a recursive copy of the original.
>>
>> Because recursiveness is assumed to be fundamental through math, the idea 
>> of a new ‘one’ is impossible. Every instance of one is a recurrence of the 
>> identical and self-same ‘one’, or an inevitable permutation derived from 
>> it. By overlooking the possibility of absolute uniqueness, computationalism 
>> must conceive of all events as local reproductions of stereotypes from a 
>> Platonic template rather than ‘true originals’.
>>
>> A ‘true original’ is that which has no possible precedent. The number one 
>> would be a true original, but then all other integers represent multiple 
>> copies of one. All rational numbers represent partial copies of one. All 
>> prime numbers are still divisible by one, so not truly “prime”, but 
>> pseudo-prime in comparison to one. One, by contrast, is prime, relative to 
>> mathematics, but no number can be a true original since it is divisible and 
>> repeatable and therefore non-unique. A true original must be indivisible 
>> and unrepeatable, like an experience, or a person. Even an experience which 
>> is part of an experiential chain that is highly repetitive is, on some 
>> level unique in the history of the universe, unlike a mathematical 
>> expression such as 5 x 4 = 20, which is never any different than 5 x 4 = 
>> 20, regardless of the context.
>>
>> I think that when we assert a universe of recursive recombinations that 
>> know no true originality, we should not disregard the fact that this 
>> strongly contradicts our intuitions about the proprietary nature of 
>> identity.  A generic universe would seem to counterfactually predict a very 
>> low interest in qualities such as individuality and originality, and 
>> identification with trivial personal preferences. Of course, what we see 
>> is the precise opposite: all celebrity is propelled by some suggestion of 
>> unrepeatability, and the fine-tuning of lifestyle choices is arguably the 
>> most prolific and successful feature of consumerism.
>>
>> If the experienced universe were strictly an outcropping of a machine 
>> that by definition can create only trivially ‘new’ combinations of copies, 
>> why would those kinds of quantitatively recombined differences such as that 
>> between 456098209093457976534 and 45609420909345797353 seem insignificant 
>> to us, but the difference between a belt worn by Elvis and a copy of that 
>> belt seem demonstrably significant to many people?
>>
>> *C. Complexity Driven Novelty*
>>
>> Because computationalism assumes *finite* simplicity,  that is, it 
>> provides only a pseudo-uniqueness by virtue of the relatively low 
>> statistical probability of large numbers overlapping each other precisely. 
>> There is no irreducible originality to the original Mona Lisa, on

Re: Trailing Dovetailer Argument

2013-10-14 Thread Bruno Marchal


On 14 Oct 2013, at 17:46, Craig Weinberg wrote:


A first draft that I posted over the weekend.

I. Trailing Dovetail Argument (TDA)

A. Computationalism makes two ontological assumptions which have not  
been properly challenged:


- The universality of recursive cardinality
- Complexity driven novelty.

Both of these, I intend to show, are intrinsically related to  
consciousness in a non-obvious way.


B. Universal Recursive Cardinality

Mathematics, I suggest, is defined by the assumption of universal  
cardinality: The universe is reducible to a multiplicity of  
discretely quantifiable units.




No mathematician will  read more after this. Sorry Craig, but if you  
use standard terms, you need to follow the standard use. Mathematicians  
avoid talking about the universe at the start, and they no longer try  
to build foundations, still less on anything discrete.


Please avoid jargon, and use simpler terms. Avoid attributing nonsense  
to others. You make unclear what generations of thinkers have succeeded  
in making clear.





The origin of cardinality, I suggest, is the partitioning or  
multiplication of a single, original unit, so that every subsequent  
unit is a recursive copy of the original.


Because recursiveness is assumed to be fundamental through math, the  
idea of a new ‘one’ is impossible. Every instance of one is a  
recurrence of the identical and self-same ‘one’, or an inevitable  
permutation derived from it. By overlooking the possibility of  
absolute uniqueness, computationalism must conceive of all events as  
local reproductions of stereotypes from a Platonic template rather  
than ‘true originals’.


A ‘true original’ is that which has no possible precedent. The  
number one would be a true original, but then all other integers  
represent multiple copies of one. All rational numbers represent  
partial copies of one. All prime numbers are still divisible by one,  
so not truly “prime”, but pseudo-prime in comparison to one. One, by  
contrast, is prime, relative to mathematics, but no number can be a  
true original since it is divisible and repeatable and therefore  
non-unique. A true original must be indivisible and unrepeatable, like  
an experience, or a person. Even an experience which is part of an  
experiential chain that is highly repetitive is, on some level  
unique in the history of the universe, unlike a mathematical  
expression such as 5 x 4 = 20, which is never any different than 5 x  
4 = 20, regardless of the context.


I think that when we assert a universe of recursive recombinations  
that know no true originality, we should not disregard the fact that  
this strongly contradicts our intuitions about the proprietary  
nature of identity.  A generic universe would seem to  
counterfactually predict a very low interest in qualities such as  
individuality and originality, and identification with trivial  
personal preferences. Of course, what we see is the precise opposite:  
all celebrity is propelled by some suggestion of unrepeatability, and  
the fine-tuning of lifestyle choices is arguably the most prolific  
and successful feature of consumerism.


If the experienced universe were strictly an outcropping of a  
machine that by definition can create only trivially ‘new’  
combinations of copies, why would those kinds of quantitatively  
recombined differences such as that between 456098209093457976534  
and 45609420909345797353 seem insignificant to us, but the  
difference between a belt worn by Elvis and a copy of that belt  
seem demonstrably significant to many people?


C. Complexity Driven Novelty

Because computationalism assumes finite simplicity,  that is, it  
provides only a pseudo-uniqueness by virtue of the relatively low  
statistical probability of large numbers overlapping each other  
precisely.




?



There is no irreducible originality to the original Mona Lisa, only  
the vastness of the physical painting’s microstructure prevents it  
from being exactly reproduced very easily.  Such a perfect  
reproduction, under computationalism, is indistinguishable from the  
original and therefore neither can be more original than the other  
(or if there are unavoidable differences due to uncertainty and  
incompleteness, they would be noise differences which would be of  
no consequence).


This is where information theory departs from realism, since reality  
provides memories and evidence of which Mona Lisa is new and which  
one was painted by Leonardo da Vinci at the beginning of the 16th  
century in Florence, Italy, Earth, Sol, Milky Way Galaxy.
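
A rough sketch of the distinction, in Python, with hypothetical byte  
strings and provenance records standing in for the painting and its  
history (the names and values are invented for illustration):

    # Two bit-identical copies carry no internal mark of which is "original";
    # only external records (memories, evidence) can distinguish them.
    import hashlib

    original = b"Mona Lisa, encoded as bytes"
    reproduction = bytes(original)           # a perfect digital reproduction

    same_information = (hashlib.sha256(original).hexdigest()
                        == hashlib.sha256(reproduction).hexdigest())
    print(same_information)                   # True: as information, there is one object

    # Provenance lives outside the data, as a separate record of history.
    provenance = {
        "original": "painted by Leonardo da Vinci, early 16th century, Florence",
        "reproduction": "produced just now",
    }
    print(provenance["original"] == provenance["reproduction"])   # False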


Mathematics can be said to allow for the possibility of novelty only  
in one direction; that of higher complexity. New qualities, by  
computationalism, must arise on the event horizons of something like  
the Universal Dovetailer. If that is the case, it seems odd that the  
language of qualia is one of rich simplicity rather than cumbersome  
computables. With comp, there can be no new ‘on
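
A toy sketch of dovetailing, in Python, for readers unfamiliar with the  
term (hedged: it interleaves a simple countable family of computations,  
not all programs as the actual Universal Dovetailer would; the  
'computations' here are invented for illustration):

    # At phase n, the first n computations each receive one more step, so
    # every computation eventually gets unboundedly many steps without any
    # single one having to finish first.
    from itertools import count

    def computation(i):
        """The i-th 'program': emits the multiples of i, one per step."""
        for step in count(1):
            yield i * step

    def dovetail(phases):
        running = []                         # computations started so far
        trace = []
        for n in range(1, phases + 1):
            running.append(computation(n))   # start computation number n
            for gen in running:              # give each one exactly one more step
                trace.append(next(gen))
        return trace

    print(dovetail(4))   # [1, 2, 2, 3, 4, 3, 4, 6, 6, 4]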

Re: Trailing Dovetailer Argument

2013-10-14 Thread John Mikes
Craig: beautiful. I saved it for my closer understanding (if...).
One little intrusion though:

*you write: the first copy of something should not be different from the
15,347,498th copy (figure arbitrary)*.
 My 'agnosticism' objects:
The first copy is restricted to the techniques applicable for copying, not
necessarily including the 'totality' of the original (infinite complexity).
The later copies copy the first one.
Meaning: we CANNOT copy in toto, only within our human capabilities.
(I extend such restriction to *'analytical'* - restricted to KNOWN parts,
to *'statistical'* dependent on the border-limits and the qualia we include
in identifying the counted items, to *'probability' *and some more.)

John Mikes



On Mon, Oct 14, 2013 at 11:46 AM, Craig Weinberg wrote:

> A first draft that I posted over the weekend.
>
> *I. Trailing Dovetail Argument (TDA)*
>
> *A. Computationalism makes two ontological assumptions which have not
> been properly challenged:*
>
>- *The universality of recursive cardinality*
>- *Complexity driven novelty*.
>
> Both of these, I intend to show, are intrinsically related to
> consciousness in a non-obvious way.
>
> *B. Universal Recursive Cardinality*
>
> Mathematics, I suggest, is defined by the assumption of universal
> cardinality: The universe is reducible to a multiplicity of discretely
> quantifiable units. The origin of cardinality, I suggest, is the
> partitioning or multiplication of a single, original unit, so that every
> subsequent unit is a recursive copy of the original.
>
> Because recursiveness is assumed to be fundamental through math, the idea
> of a new ‘one’ is impossible. Every instance of one is a recurrence of the
> identical and self-same ‘one’, or an inevitable permutation derived from
> it. By overlooking the possibility of absolute uniqueness, computationalism
> must conceive of all events as local reproductions of stereotypes from a
> Platonic template rather than ‘true originals’.
>
> A ‘true original’ is that which has no possible precedent. The number one
> would be a true original, but then all other integers represent multiple
> copies of one. All rational numbers represent partial copies of one. All
> prime numbers are still divisible by one, so not truly “prime”, but
> pseudo-prime in comparison to one. One, by contrast, is prime, relative to
> mathematics, but no number can be a true original since it is divisible and
> repeatable and therefore non-unique. A true original must be indivisible
> and unrepeatable, like an experience, or a person. Even an experience which
> is part of an experiential chain that is highly repetitive is, on some
> level unique in the history of the universe, unlike a mathematical
> expression such as 5 x 4 = 20, which is never any different than 5 x 4 =
> 20, regardless of the context.
>
> I think that when we assert a universe of recursive recombinations that
> know no true originality, we should not disregard the fact that this
> strongly contradicts our intuitions about the proprietary nature of
> identity.  A generic universe would seem to counterfactually predict a very
> low interest in qualities such as individuality and originality, and
> identification with trivial personal preferences. Of course, what we see
> is the precise opposite: all celebrity is propelled by some suggestion of
> unrepeatability, and the fine-tuning of lifestyle choices is arguably the
> most prolific and successful feature of consumerism.
>
> If the experienced universe were strictly an outcropping of a machine that
> by definition can create only trivially ‘new’ combinations of copies, why
> would those kinds of quantitatively recombined differences such as that
> between 456098209093457976534 and 45609420909345797353 seem insignificant
> to us, but the difference between a belt worn by Elvis and a copy of that
> belt seem demonstrably significant to many people?
>
> *C. Complexity Driven Novelty*
>
> Because computationalism assumes *finite* simplicity,  that is, it
> provides only a pseudo-uniqueness by virtue of the relatively low
> statistical probability of large numbers overlapping each other precisely.
> There is no irreducible originality to the original Mona Lisa, only the
> vastness of the physical painting’s microstructure prevents it from being
> exactly reproduced very easily.  Such a perfect reproduction, under
> computationalism, is indistinguishable from the original and therefore
> neither can be more original than the other (or if there are unavoidable
> differences due to uncertainty and incompleteness, they would be noise
> differences which would be of no consequence).
>
> *This is where information theory departs from realism, since reality
> provides memories and evidence of which Mona Lisa is new and which one was
> painted by Leonardo da Vinci at the beginning of the 16th century in
> Florence, Italy, Earth, Sol, Milky Way Galaxy*.
>
> Mathematics can be said to allow for the possibility of novelt