The "all the code and configuration goes everywhere but which services are 
activated depends on identity" model is how many of us are building production 
systems.  Most large scale distributed systems I've worked on the past decade 
are not reconstructable from source. Rather the disk images are handed down 
from generation to generation, with bits getting swapped out by hand 
occasionally.  Development is the act of mutating a system and weeding out the 
versions that don't work. When you have thousands of servers scattered all over 
the globe, all chatting with each other, this behavior emerges naturally. 
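
A minimal sketch of the model (in Go; the role names, hostname convention, 
and service list are all made up for illustration, not from any real system): 
every node boots the same image, and identity alone decides what runs.

    package main

    import (
        "fmt"
        "os"
        "strings"
    )

    // Hypothetical registry: every image ships with every service,
    // but only those matching the node's identity are activated.
    var servicesByRole = map[string][]string{
        "frontend": {"http-gateway", "session-cache"},
        "storage":  {"blob-store", "replicator"},
        "index":    {"crawler", "query-engine"},
    }

    func roleFromIdentity(hostname string) string {
        // Assumes hostnames like "storage-017.dc3.example.com".
        return strings.SplitN(hostname, "-", 2)[0]
    }

    func main() {
        host, err := os.Hostname()
        if err != nil {
            panic(err)
        }
        for _, svc := range servicesByRole[roleFromIdentity(host)] {
            fmt.Printf("activating %s on %s\n", svc, host)
            // a real system would exec or supervise the service here
        }
    }

The point is that the code above never changes between machines; only the 
hostname does.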

Now, the main mistake most people make when looking at the future of computing 
is forgetting that languages, operating systems, etc., matter less than the 
scale of the hardware. Once you have a computational fabric at massive scale, 
your concerns cease to be the same. Time sharing goes away when everyone has 
thousands of cores. Message passing becomes the norm, but relativity becomes 
the new reality: there is no global clock, only messages arriving after a 
delay.
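
To make "message passing plus relativity" concrete, here is a toy sketch 
(Go again; the two-core topology and the Lamport-style tick counting are my 
own illustration, not anyone's production design). Each core owns its state, 
talks only through channels, and knows about time only relative to the 
messages it has seen.

    package main

    import "fmt"

    // msg carries the sender's logical clock; there is no global time,
    // only what each message tells you about the sender's past.
    type msg struct {
        from, tick int
    }

    // core communicates purely by message passing, advancing a
    // Lamport-style logical clock on every receive.
    func core(id int, in <-chan msg, out chan<- msg, done chan<- bool) {
        clock := 0
        for i := 0; i < 3; i++ {
            m := <-in
            if m.tick > clock {
                clock = m.tick // learn the sender's (relative) past
            }
            clock++
            fmt.Printf("core %d saw tick %d from core %d, now at %d\n",
                id, m.tick, m.from, clock)
            out <- msg{from: id, tick: clock}
        }
        done <- true
    }

    func main() {
        ab, ba := make(chan msg, 1), make(chan msg, 1)
        done := make(chan bool)
        go core(1, ab, ba, done)
        go core(2, ba, ab, done)
        ab <- msg{from: 0, tick: 0} // seed the exchange
        <-done
        <-done
    }

Notice that neither core can say what "now" means for the other; it can only 
order events against the ticks it has received.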

When computation is a built-in property of the nano-scale structure of things, 
we will see a near-total breakdown of current programming techniques. But at 
the same time, fewer and fewer people will have the perspective to make it do 
anything useful. Look at FPGAs and think of what you would do if every square 
inch of your home, clothes, and workplace were covered in them. Could you 
design software that would run on a truly massive scale?
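
As a thought experiment in that direction, here is a toy sketch (Go, not real 
FPGA tooling; the grid size, source cell, and hop-count gradient are invented 
for illustration) of the neighbour-only style of programming such a fabric 
forces on you: no cell ever sees global state, yet a global structure, a 
distance gradient from a source, emerges from purely local relaxation.

    package main

    import "fmt"

    // A toy "computational surface": a grid of cells, each able to see
    // only its four neighbours, standing in for a fabric of tiny
    // FPGA-like elements. The cells flood a hop-count gradient outward
    // from a single source cell.
    const size = 8

    func step(grid [size][size]int) [size][size]int {
        next := grid
        for y := 0; y < size; y++ {
            for x := 0; x < size; x++ {
                best := grid[y][x]
                for _, d := range [][2]int{{0, -1}, {0, 1}, {-1, 0}, {1, 0}} {
                    nx, ny := x+d[0], y+d[1]
                    if nx < 0 || ny < 0 || nx >= size || ny >= size {
                        continue
                    }
                    if grid[ny][nx]+1 < best {
                        best = grid[ny][nx] + 1 // relax toward the source
                    }
                }
                next[y][x] = best
            }
        }
        return next
    }

    func main() {
        var grid [size][size]int
        for y := range grid {
            for x := range grid {
                grid[y][x] = size * size // effectively "infinity"
            }
        }
        grid[3][3] = 0 // the source cell
        for i := 0; i < size; i++ {
            grid = step(grid)
        }
        for _, row := range grid {
            fmt.Println(row)
        }
    }

Scaling that idiom from an 8x8 toy to billions of cells, with failures and 
signal delays, is exactly the kind of software problem the question above is 
asking about.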

