At a speedup of 1:10^6, a wall-clock day is worth about 3 kiloyears of simulation time.
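The arithmetic behind that figure is easy to check; a quick sketch (the 10^6 speedup is the figure under discussion, and a Julian year is assumed):

```python
# Back-of-envelope check of the speedup claim above.
SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = 365.25 * 24 * 3_600  # Julian year

def sim_years_per_wall_day(speedup: float) -> float:
    """Simulated years gained per wall-clock day at a given speedup."""
    return speedup * SECONDS_PER_DAY / SECONDS_PER_YEAR

years = sim_years_per_wall_day(1e6)
print(f"{years:.0f} simulated years per wall-clock day")
```

At a 10^6 speedup this comes out to roughly 2.7 kiloyears per wall-clock day, consistent with the "3 kiloyears" round figure.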
10^9 generations per second? This rate depends (inversely) on the complexity of the organisms being simulated.
10^9 generations/second is absurdly high; 10^9 is about the top *event* rate you could hope to achieve in the simulation, given what we know of computational physics. Fitness testing in seconds to minutes on very large populations looks very doable, though. Some complex behaviours could be evaluated in roughly 10-100 ms with massively parallel molecular hardware.
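For what it's worth, the gap between these figures falls out of a simple throughput model; the evaluation latency, population size, and degree of parallelism below are illustrative assumptions, not measurements:

```python
import math

def generations_per_second(eval_latency_s: float,
                           population: int,
                           parallel_evaluators: int) -> float:
    """Generations/s if each generation fitness-tests the whole population
    and evaluations run concurrently on parallel_evaluators units."""
    batches = math.ceil(population / parallel_evaluators)
    return 1.0 / (batches * eval_latency_s)

# e.g. a 10 ms fitness test, 10^6 individuals, 10^6 parallel units:
rate = generations_per_second(0.010, 10**6, 10**6)
```

Even with a million fully parallel evaluators and a 10 ms test, this gives on the order of 100 generations per second, which is why 10^9 generations/second looks absurd while seconds-to-minutes fitness testing does not.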
You guys are throwing around orders of magnitude like ping pong balls based on very little practical evidence. Sometimes no estimate is less misleading than one that is arbitrary.
No. The simulation handles the virtual substrate, and that's O(1) if you match organism size to the volume of dedicated hardware, assuming local signalling (which is ~ms constrained in biology, and ~ps..~fs constrained relativistically).
I was referring to the complexity of the organism's mind. Surely you are not going to tell me that as the evolving brains increase in complexity, there is no effect on the simulation speed?
People can be paid, or can volunteer, to judge organism performance from an interactive simulation. Co-evolution has a built-in drive: it has no intrinsic fitness function, only the naturally emergent one.
Yes, co-evolution is key.
But in order for interesting things to happen, organisms have to be able to interact with one another for quite some time before the grim reaper does his grim business.
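A minimal sketch of that lifecycle: organisms interact for many rounds before the reaper culls, so ranking emerges from encounters rather than a fixed test. Everything here (the toy interaction rule, the cull fraction, the mutation noise) is an illustrative assumption:

```python
import random

def interact(a, b, target=0.5):
    """Toy encounter: whichever organism is closer to the target wins.
    In a real system the payoff would emerge from co-evolution."""
    return (1, 0) if abs(a - target) <= abs(b - target) else (0, 1)

def lifecycle(population, rounds, cull_fraction, rng):
    """Let organisms interact for many rounds, then cull the weakest
    and refill the population with mutated survivors."""
    wins = [0] * len(population)
    for _ in range(rounds):
        i, j = rng.sample(range(len(population)), 2)
        wi, wj = interact(population[i], population[j])
        wins[i] += wi
        wins[j] += wj
    order = sorted(range(len(population)), key=lambda k: wins[k], reverse=True)
    n_keep = int(len(population) * (1 - cull_fraction))
    survivors = [population[k] for k in order[:n_keep]]
    while len(survivors) < len(population):
        parent = rng.choice(survivors[:n_keep])
        survivors.append(parent + rng.gauss(0, 0.05))  # point mutation
    return survivors

rng = random.Random(0)
pop = [rng.random() for _ in range(50)]
for _ in range(30):
    pop = lifecycle(pop, rounds=200, cull_fraction=0.2, rng=rng)
```

The point of the many-rounds loop is exactly the one made above: if the reaper strikes after a single encounter, nothing interesting can accumulate; with many encounters per lifetime, relative performance becomes meaningful.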
What I'm interested in is an efficient, robustly evolvable framework. It doesn't take more than insect-equivalent complexity to achieve that. This implies full genetic determinism and simple fitness testing.
I'm confused, all you want are Ants? Or did you mean AGI in ant-bodies?
Are you seeing any specific physical limits in building systems hundreds of km^3 large? And why do you think you need systems of nontrivial size for evolutionary bootstrap of intelligence? Buckytronics are just molecules.
The idea of bootstrapping intelligence is interesting, but far from proven. That too will require much engineering.
No, I am seeing no constraints on the size of computers, apart from running out of matter. As I've stated before, I think you need a planet-sized computer to run a planet-sized competition to an interesting degree. But then, that's me playing fast and loose with the numbers.
So I'll retract it and just say that it is my belief that "evolving" AIs is not feasible within a reasonable time frame, because the fitness landscape is hopelessly large. We are better off engineering them manually.
We'll have to agree to disagree on that because I can't continue to speculate usefully on what we can accomplish with planet sized molecular computers.
Constraints on biological tissue are very different from constraints of electron or electron spin distributions in solid state circuits switching in GHz to THz range.
While the overall architecture definitely contains lots of components necessary for hitting a fertile region in problem space, slavishly copying the microarchitecture is likely to only lead you astray.
I am referring to constraints on the complexity of information processing, abstracted from the biophysical implementation.
And again... please do not put words into my mouth. I have stated explicitly that I'm not for "slavishly copying the microarchitecture", rather just using the brain as a lesson in what "hard" means.
-Brad
