Okay, I grant you that the Deep Blue machine is part of the sediment buried 
under Moore's Law -- I had not looked at the benchmarks as closely as I should 
have. It was late at night, and I am going to stick with that answer :) 
 
As for the larger discussion, I guess it boils down to my doubt about the 
theoretical possibility of a universal computer. Every computer that we know 
about executes within a defined execution context and within a local frame of 
reference. That execution context is bounded and limited; it does not, and in 
my opinion cannot, extend infinitely. I am pretty certain others will disagree 
and argue that a universal computer can exist, one that executes in a 
universal, all-encompassing context, but I do not see how this can be. A 
computer requires a substrate to operate upon -- the CPU chips, for example, 
are the substrate our computers operate on. I know of no computer that does 
not require this external, structured environment, which necessarily exists 
outside of the computer itself, in order for it to operate. 
Every computer in existence requires external enabling hardware. 
 
If a computer requires a substrate which it can manipulate in order to perform 
its logical operations, then a universal computer is impossible, because the 
substrate would necessarily lie outside of, and be foundational to, its domain.
 
In any non-universal computer we are back to the limits posed by execution 
context and local frame of reference. A process may be shown to be 
deterministic within some frame, within an execution context, but because -- I 
argue -- there can be no all-encompassing universal execution context that 
does not itself rely on some external substrate to enable its basic 
operations, there will always be other execution contexts and processes 
operating independently of any given context. 
 
Now, when different execution contexts begin communicating messages to each 
other, how can a global outcome be said to be deterministic within the scope 
of any given execution context? Each execution context operates in its own 
frame of reference and generates outcomes based on that frame. However, its 
frame is not completely isolated from the other frames of reference in the 
larger linked meta-system -- take the internet, for example, as a single 
loosely coupled dynamic entity comprised of perhaps billions of connected 
devices, each operating in its own local frame.
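
To make that concrete, here is a minimal Python sketch (purely illustrative -- 
the worker names and the shared log are made up for the example): each worker 
is deterministic in isolation, yet the global order in which their messages 
land in the shared log is decided by a scheduler that sits outside either 
worker's frame.

    import threading

    # Two locally deterministic workers append messages to a shared log.
    # Each worker's own sequence of actions is fixed, but the interleaving
    # of their messages is decided by the thread scheduler -- something
    # outside either worker's frame -- so the global ordering can differ
    # from run to run.
    log = []
    lock = threading.Lock()

    def worker(name):
        for i in range(3):
            with lock:                           # each append is deterministic in isolation
                log.append("%s:%d" % (name, i))  # but *when* it happens is not fixed

    threads = [threading.Thread(target=worker, args=(n,)) for n in ("A", "B")]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    print(log)  # e.g. ['A:0', 'A:1', 'B:0', ...] -- ordering may vary between runs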
 
I find it hard to accept the idea that this massive meta-entity of millions 
and millions of separate servers can be described as deterministic as a whole. 
The individual executing agents or processes -- which, linked by the trillions 
of messages being sent back and forth, together comprise this larger 
meta-entity -- can be modeled in a deterministic fashion within their 
individual frames of reference and execution contexts. 
 
But can one say the same thing about the larger meta entity that emerges from 
the subtle interactions of the many hundreds of millions of executing processes 
that dynamically impinge on it and through which it emerges?
 
When one speaks of outcomes, they often depend on subtle, rapidly varying 
variables, such that the result of running a function may change from instant 
to instant. Within the execution context of the function producing the result 
we can prove that the function is deterministic, but once it is loosely linked 
to other separately running execution frames it becomes harder to 
deterministically predict any given outcome, until some threshold of 
complexity and noise is reached where it becomes impossible to work back from 
the outcome and show how it was determined.
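
A trivial sketch of the distinction (my own illustration -- the wall clock 
below just stands in for any rapidly varying variable owned by some other 
execution frame):

    import time

    def pure_double(x):
        # Determined entirely by its input: same x, same result, every time.
        return 2 * x

    def coupled_double(x):
        # Also depends on a rapidly varying quantity outside the function's
        # own frame (the wall clock, standing in for some other process),
        # so re-running it an instant later can give a different answer.
        return 2 * x + int(time.time() * 1000) % 7

    print(pure_double(21), pure_double(21))        # always 42 42
    print(coupled_double(21), coupled_double(21))  # may differ from call to call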
 
Metaphorically, I suppose you could imagine a pond with random pebbles being 
tossed into it from many directions. At first it will be possible to analyze 
the ripples and their interference patterns, work back to the time and place 
at which each pebble hit the water, and determine the angle, size, speed, etc. 
of each pebble. But play this forward and keep throwing more and more pebbles 
onto the pond's surface from different angles and at different speeds. After 
some time, can one work back to the first pebble and determine the specifics 
of that single event? In practice, obviously, we cannot do so no matter how 
much computing power we throw at the problem, because untangling the 
interactions and interference patterns of the millions of ripples spreading 
out from different points grows exponentially more difficult, until all the 
computers in the universe working together would be unable to solve the 
problem... for a big enough pond, that is, of course.
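
If it helps, the forward half of that metaphor is easy to write down. Here is 
a rough Python sketch under toy assumptions (a deliberately crude ripple model 
I made up for illustration -- no damping, no reflections):

    import math, random

    # Forward problem: superpose the ripples of N pebbles at one sample point.
    def surface_height(pebbles, px, py, t, speed=1.0):
        total = 0.0
        for (x0, y0, t0, amp) in pebbles:
            r = math.hypot(px - x0, py - y0)
            if t > t0:
                total += amp * math.sin(speed * (t - t0) - r)
        return total

    # The forward calculation stays cheap as N grows. The inverse problem --
    # recovering every (x0, y0, t0, amp) from sampled heights -- gains four
    # unknowns per pebble while the ripples interfere ever more thoroughly,
    # which is the point of the metaphor.
    pebbles = [(random.uniform(-5, 5), random.uniform(-5, 5),
                random.uniform(0, 10), random.uniform(0.5, 2.0))
               for _ in range(10000)]
    print(surface_height(pebbles, 0.0, 0.0, t=20.0))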
 
Perhaps one could even invoke quantum erasure and say that once an event has 
become so interfered with by other events that no trace of it can be 
distinguished from the noise in the system, then in a sense has not that event 
been erased? 
 
And yet the current state of the system has, in some infinitesimally minuscule 
way, been affected by it, through an exceedingly long chain of events leading 
to other events and so forth.
Determinism depends on having a frame of reference and can only be defined 
within some frame of reference. I do not see how universal determinism can be 
demonstrated; perhaps I am wrong and it can be -- if so, I would like to hear 
how it can be logically proved.
 
Cheers,
-Chris
 
  

________________________________
 From: John Clark <johnkcl...@gmail.com>
To: everything-list@googlegroups.com 
Sent: Friday, August 23, 2013 10:48 AM
Subject: Re: When will a computer pass the Turing Test?

On Thu, Aug 22, 2013 at 4:28 PM, Chris de Morsella <cdemorse...@yahoo.com> 
wrote:


>>> If it's not random then it happened for a reason, and things happen in a 
>>> computer for a reason too.
>
>> Sure, but the "reason" may not be amenable to being completely contained 
>> within the confines of a deterministic algorithm 

What on earth are you talking about? The deterministic algorithm behaves as it 
does for a reason but does not do so for a reason??!!


 
> if it depends on a series of outside processes 

If it depends on something then it's deterministic.

 


> > At the time it may have been a supercomputer but that was 16 years ago and 
> > the computer you're reading this E mail message on right now is almost 
> > certainly more powerful than the computer that beat the best human chess 
> > player in the world. And chess programs have gotten a lot better too. So 
> > all that spaghetti and complexity at the cellular level that you were 
> > rhapsodizing about didn't work as well as an antique computer running an 
> > ancient chess program.
>>

>
> You are incorrect; even today Deep Blue is still quite powerful compared to 
> a PC.

Not unless your meaning of "powerful" is radically different from mine. 

  
> The Deep Blue machine specs: 
> It was a massively parallel, RS/6000 SP Thin P2SC-based system with 30 nodes, 
> with each node containing a 120 MHz P2SC microprocessor for a total of 30, 
> enhanced with 480 special purpose VLSI chess chips. Its chess playing program 
> was written in C and ran under the AIX operating system. It was capable of 
> evaluating 200 million positions per second, twice as fast as the 1996 
> version. In June 1997, Deep Blue was the 259th most powerful supercomputer 
> according to the TOP500 list, achieving 11.38 GFLOPS on the High-Performance 
> LINPACK benchmark.[12]
>

OK. 


> I doubt the machine you are writing your email on even comes close to 
> that level of performance; I know mine does not achieve that level of 
> performance.
>

Are you really quite sure of that? The computer I'm typing this on is an 
ancient iMac that was not top of the line even a full Moore's Law generation 
ago when it was new, back in the olden bygone days of 2011. Like all 
computers, the number of floating point operations per second it can perform 
depends on the problem, but when computing dot products running multi-threaded 
vector code it runs at 34.3 GFLOPS; so Deep Blue running at 11.38 GFLOPS 
doesn't seem as impressive as it did in 1997.

Right now the fastest supercomputer in the world has a LINPACK rating of 54.9 
petaflops, and a petaflop is a million GFLOPS; so today that Chinese 
supercomputer is 4.8 million times as powerful as Deep Blue was in 1997. And 
in just a few years that supercomputer will join Deep Blue on the antique 
computer junk pile.


John K Clark
