There is a subtle difference between what I was proposing and Genetic 
Algorithms. A Genetic Algorithm is created by a small number of authors. In a 
sense, they are implicitly embedding their intelligence and understanding 
right into the code. They may not know what the best results will be, but by 
virtue of the construction, they've constrained the search space. As time 
progresses, the algorithm should converge on an optimal solution.
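
To make that concrete, here is a rough sketch in toy Python (my own
illustration, with a made-up genome length and fitness function) of how the
authors' choices constrain a GA before it ever runs:

# Minimal GA sketch (toy example): the genome length, fitness function,
# and operators are all fixed by the authors up front, so the search space
# is constrained before the run starts.
import random

GENOME_LENGTH = 20    # fixed by the authors: the space is 2^20 bit strings
POPULATION = 50
GENERATIONS = 100

def fitness(genome):
    # The authors' notion of "good" is baked in here; toy target: all ones.
    return sum(genome)

def mutate(genome, rate=0.01):
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

def crossover(a, b):
    cut = random.randrange(1, GENOME_LENGTH)
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(GENOME_LENGTH)]
       for _ in range(POPULATION)]

for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POPULATION // 2]      # selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POPULATION - len(parents))]
    pop = parents + children             # drifts toward the single optimum

print(max(fitness(g) for g in pop))      # converges on (or near) 20

Everything interesting about that run was decided by its authors before the
first generation.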


If, however, the system isn't authored by a small number of people, but rather 
by a vast and essentially unlimited group, then the system isn't converging so 
much as it is just filling in all of the possibilities. Rather than the optimal 
solution to a fixed number of variables, the system becomes an expansive 
solution to an unlimited number of variables. That is, it grows beyond its 
boundaries (unevenly, of course) and basically acts like a Brownian flood-fill 
of the knowledge space. 
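
By contrast, a toy sketch of the flood-fill picture (again just my own
illustration, with made-up names): an unbounded crowd keeps inventing new
variables and contributing little pieces, so coverage expands rather than
converges:

# Toy "Brownian flood-fill" sketch: contributors keep inventing new variables
# and dropping in little pieces, so the covered space grows unevenly instead
# of converging on a single optimum.
import random

variables = ["v0"]   # the space of variables itself keeps growing
knowledge = {}       # variable -> the piece somebody contributed for it

def contribute(author):
    if random.random() < 0.3:              # sometimes a new variable appears
        variables.append("v%d" % len(variables))
    target = random.choice(variables)      # wander to some variable...
    knowledge.setdefault(target, "piece-from-%d" % author)   # ...and fill it in

for author in range(10000):
    contribute(author)

print("%d of %d variables covered" % (len(knowledge), len(variables)))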


Ultimately, for a given computation both systems might arrive at optimal or 
near-optimal behavior, but given that code is always constrained by its 
authors, consolidating a massive number of different works into some (sort of) 
coherent system is considerably more powerful. In a sense, it's like comparing 
one very well-written program to all of the code on the Internet. No doubt 
the program wins on elegance, but the Internet just covers so much more. If we 
could find a way to harness collective intelligence, but constrain all of it 
to work together nicely, then I think, as the Internet has shown us, we can 
achieve wonderful things.

I wouldn't ignore the data/code duality, precisely because it shows up in so 
many interesting places. It dominates our view of the world as basically 
object/time (3D + a 4th D). It shows up in our languages as nouns/verbs. It 
even shows up in our abstractions as static/dynamic. It shows up in so many 
places that I expect it is really a fundamental physical property of our 
universe that just propagates outwards into our perceptions. And in most cases 
there are ways to migrate 'things' from one side to the other (which no doubt 
has some deeper ramifications). 
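
A small Python illustration of migrating across the duality (my own toy
example, nothing more): code held as data, and data turned back into code:

# Code as data, data as code: a tiny illustration of moving across the duality.
import ast

source = "def double(x):\n    return x * 2\n"

# The 'verb' captured as a noun: source text parsed into a data structure.
tree = ast.parse(source)
print(type(tree.body[0]).__name__)     # -> FunctionDef

# And back: the same data compiled and executed into live behavior.
namespace = {}
exec(compile(source, "<generated>", "exec"), namespace)
print(namespace["double"](21))         # -> 42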


Paul.  





>________________________________
> From: Kurt Stephens <[email protected]>
>To: [email protected] 
>Sent: Tuesday, October 30, 2012 2:03:23 PM
>Subject: Re: [fonc] How it is
> 
>On 10/3/12 9:53 AM, Paul Homer wrote:
>
>> If instead, programmers just built little pieces, and it was the
>> computer itself that was responsible for assembling it all together into
>> mega-systems, then we could reach scales that are unimaginable today. To
>> do this of course, the pieces would have to be tightly organized.
>Tightly organized != tightly coupled.
>
>> Contributors wouldn't have the freedom they do now, but that's a
>> necessity to move from what is essentially a competitive environment to
>> a cooperative one. Some very fixed rules become necessary.
>> 
>> One question that arises from this type of idea is whether or not it is
>> even possible for a computer to assemble a massive working system from
>> say, a billion little code fragments. 
>Genetic Algorithms already do this if one discards the notion of
>code/data duality.
>
>> Paul.
>-- KAS
>
_______________________________________________
fonc mailing list
[email protected]
http://vpri.org/mailman/listinfo/fonc
