On Friday, January 31, 2014 4:09:38 PM UTC-5, Liz R wrote:
>
> On 1 February 2014 01:33, Craig Weinberg <whats...@gmail.com> wrote:
>
>> On Friday, January 31, 2014 2:15:55 AM UTC-5, Liz R wrote:
>>
>>> On 31 January 2014 17:13, Craig Weinberg <whats...@gmail.com> wrote:
>>>
>>>> On Thursday, January 30, 2014 10:32:02 PM UTC-5, Liz R wrote:
>>>>>
>>>>> It isn't *essential*. Technically, I believe I/O can be added to a 
>>>>> computer programme as some sort of initial settings (for any given run of 
>>>>> the programme). 
>>>>>
>>>>
>>>> Added how though? By inputting code, yes?
>>>>
>>>
>>> All code has to be input. That isn't input TO the programme, however, 
>>> it's setting up the programme before it is run. 
>>>
>>
>> Right, but that's my point. Computationalism overlooks its own 
>> instantiation through input. It begins assuming that code is running. It 
>> begins with the assumption that coding methods exist. I am saying that 
>> those methods can only be sensory-motive, and that sensory-motive phenomena 
>> must precede the first possible instance of computation.
>>
>
> I doubt J.A.W. would have accepted that as a valid criticism of "It from 
> Bit" and I can't see that it's valid for comp either (or even Edgar's 
> whatever-the-hell-it-is). If brains compute, they presumably start by 
> bootstrapping themselves, and only later get programmed by input from the 
> outside world.
>

When did the world become 'outside' though? If you bootstrap from 
immaterial Platonia that has no outside, how and why do numbers acquire 
non-numerical dimensionality?
 

> Likewise one can imagine a self-assembling computer. This is simply 
> *incidental* to how humans get computation done - like I/O, it isn't 
> ontologically fundamental.
>

I think that it is meta-ontologically fundamental. Comp just ignores the 
question of I/O because it is too superficial a treatment of reality to 
examine it.
 

>   
>>>>
>>>>> Obviously this isn't much use in practice, of course! But from a 
>>>>> philosophical perspective it's possible, so it isn't ontologically 
>>>>> essential to the function of computation.
>>>>>
>>>>> A trivial example would be my son's Python programme to generate 2000 
>>>>> digits of pi. It just uses some existing equation which generates each 
>>>>> digit in sequence. It happens to write the output to the screen, but if 
>>>>> he 
>>>>> took out the relevant print statement, it wouldn't - but it would still 
>>>>> compute the result.
>>>>>
>>>>
>>>> The existing equation was input at some point though, and without the 
>>>> output, whether or not there was a computation is academic (and 
>>>> unfalsifiable). 
>>>>
>>>
>>> That wasn't the point. The question was whether I/O is ontologically 
>>> essential to the function of computation. Quite clearly, the answer is no. 
>>> The function of computation *can* exist without any I/O, so that 
>>> answers the question.
>>>
>>
>> I disagree. I don't think that we know that. There is no possible case 
>> where computation without output is observed, so we cannot assume that 
>> computation is ontologically possible without output. We cannot assume that 
>> theoretical computation is free from the ontological constraints that real 
>> computation is subject to in our experience.
>>
>
> Computation without any output can be observed by examining the machinery 
> involved, if necessary. But I bet you'll just redefine output to mean 
> whatever the hell you want it to, just as you got around an honest attempt 
> to show a flaw in your argument with a ridiculous comment about computation 
> being academic without any output, as though a programme that hangs in an 
> infinite loop without producing output is somehow not computing,
>

It's not that it isn't computing, it is that it is impossible for it to 
matter whether it is computing or not. Computing is irrelevant to us 
without I/O, so why should we expect that it is any more relevant to 
itself? I missed the honest attempt to show a flaw in my argument though - 
which flaw is that?
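
For concreteness, a minimal sketch of the kind of programme Liz describes 
above. Her son's actual code isn't shown in the thread; this version assumes 
Machin's formula, pi = 16*arctan(1/5) - 4*arctan(1/239), evaluated in scaled 
integer arithmetic, and the function names are my own illustration. The final 
print is the programme's only output step; delete it and the computation 
still runs to completion, which is exactly the case under dispute:

    def arctan_inv(x, scale):
        # arctan(1/x) * scale, summing the alternating Taylor series
        # 1/x - 1/(3x^3) + 1/(5x^5) - ... until the terms truncate to zero.
        power = scale // x
        total = power
        n = 1
        while power:
            power //= x * x
            term = power // (2 * n + 1)
            total += -term if n % 2 else term
            n += 1
        return total

    def pi_digits(digits):
        scale = 10 ** (digits + 10)  # ten guard digits against truncation
        pi_scaled = 4 * (4 * arctan_inv(5, scale) - arctan_inv(239, scale))
        return str(pi_scaled)[:digits + 1]  # "3" followed by the decimals

    result = pi_digits(2000)  # the computation happens here, output or not
    print(result)             # the only output step; removing this line
                              # leaves the computation itself untouched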
 

> as is a programme that runs in the background - the "magic" is supposed to 
> happen at the moment of output? 
>

It's not magic, it's sensory experience. That which makes anything matter.
 

> A programme that runs for 100 days factoring a huge number "didn't do 
> anything" even though it racked up a massive power bill and used 99% of the 
> CPU time and 95% of the memory if the plug gets pulled just before it gives 
> its output? Sorry, but this is just nonsense.
>

It's doing something, but what it is doing is completely worthless. There 
is no functional difference between what it is doing and just spinning hard 
drives.
 

> I gave the answer to your question. The answer was no. If that doesn't fit 
> with some theory, redesign the theory, don't go into an Edgar-spiral of 
> hand-waving and spouting nonsense.
>

Your objections were already factored in before I asked the question. 
Obviously computer science does not consider I/O to be ontologically 
necessary, but I am proposing that this is because computer science is 
theoretical and exists within a toy model of itself, rather than a thorough 
account of what is realistically required, ontologically, for computation to 
arise. 

 
>>> I was just answering your question honestly and as accurately as I 
>>> could. If you're going to change the question to something else when I 
>>> attempt to answer it, I won't bother in future.
>>>
>>
>> You're answering it honestly, but you are assuming a universe in which 
>> sensory experience is theoretical and computation is actual. I am pointing 
>> out that this is a theoretical perspective. 
>>  
> I'm answering it within the bounds of the everyday experience we have 
> with computers. I don't say sensory experience is theoretical, I just 
> assume the standard model of how things work.
>

It is usually a mistake to assume the standard model of how things work, 
IMO. 
 

> If you are going to make some weird ontological assumptions I would 
> appreciate it if you stated them up front and kept reminding me that this 
> is the basis you're working on. 
>

I assume nothing except what I have no choice but to assume from my own 
experience.
 

> Otherwise I assume the default assumptions for the field in question, 
> which in this case is computation. I gave an honest answer on that basis, 
> but since it showed the answer was one you didn't like, you immediately 
> moved the goalposts.
>

It's not that I don't like the answer; it's the one I expected. I'm proposing 
that it is an incomplete account of reality, one which makes computationalism 
seem more plausible than it will ever be.
 

>
> To be honest, although I think you were asking a genuine question, that is 
> exactly what trolls do.
>

I don't understand trolling. Seems like a waste of time.

