On Fri, Jun 15, 2012 at 1:38 PM, Paul Homer <paul_ho...@yahoo.ca> wrote:

> there is some underlying complexity tied to the functionality that
> dictates that it could never be any less than X lines of code. The system
> encapsulates a significant amount of information and, stealing from Shannon
> slightly, it cannot be represented in any fewer bits.
>

A valid question might be: how much of this information should be
represented in code, and how much should instead be captured
heuristically by generic machine-learning techniques, indeterminate SMT
solvers, or stability models? I can think of much functionality today,
in control systems, configurations, UIs, etc., that would be better
achieved (more adaptive, reactive, flexible) through generic mechanisms,
as sketched below.
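
To make that concrete, here is a minimal sketch (assuming the Z3 SMT
solver's Python bindings, installed as z3-solver; the pane names and
constraints are invented for illustration). Rather than hand-coding the
logic that sizes two UI panes, we declare constraints and let a generic
solver find a configuration:

from z3 import Ints, Solver, sat

# Unknowns: widths of two hypothetical UI panes.
panel, sidebar = Ints('panel sidebar')

s = Solver()
s.add(panel + sidebar == 1024)  # the two panes must fill the window
s.add(sidebar >= 150)           # the sidebar needs a minimum usable width
s.add(panel >= 3 * sidebar)     # keep the main panel at least 3x wider

# The generic solver, not hand-written layout code, finds the values.
if s.check() == sat:
    m = s.model()
    print(m[panel], m[sidebar])

When a requirement changes, we adjust a constraint instead of rewriting
control flow; that is the adaptivity a generic mechanism buys.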

Sure, there is a "minimum number of bits" needed to represent the
information in the system, but lines of code measure human effort, not
information in general.


>
> If things are expanding, then they have to get more complex; they
> encompass more.
>

Complexity can be measured by the number of possible states or
configurations, and in that sense things do get more complex as they
scale. But they don't need to become more *complicated*: the underlying
structure can become simpler, more uniform, especially compared to what
we have today.
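
As a toy illustration (the feature flags below are invented), n
independent options yield 2**n possible states, yet the description of
the system stays linear in n; the state space scales exponentially while
the structure stays uniform:

from itertools import product

# Four independent options: the state space is 2**4 = 16 configurations,
# but the description of the system is just this four-element list.
flags = ['dark_mode', 'autosave', 'spellcheck', 'telemetry']
states = list(product([False, True], repeat=len(flags)))
print(len(states))  # 16: each new flag doubles the states, adds one line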

Regards,

David

-- 
bringing s-words to a pen fight