---------- Forwarded message ----------
Date: 11 Dec 98 10:35:18
From: Roberto Verzola <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
To: [EMAIL PROTECTED], [EMAIL PROTECTED]
Subject: [interdoc-y2k 17] Modularization vs globalization

I suggested in my earlier message that the shift from a risk-minimizing strategy to a gain-maximizing strategy is one major flaw that led to the Y2K problem. A second flaw, I suggest, is the shift from a "modular" to a "globalist" approach, which converts a network of relatively independent but interconnected systems into a single tightly-coupled complex system.

Below is an article on this topic which I had submitted earlier in the GKD/Y2K discussions but which was not posted. The article provides a theoretical argument for modularization, based on the experiences of systems designers.

Roberto Verzola

---------------------

Our Economic System: Badly Designed?

by Roberto Verzola*

The international financial crisis which struck Asian countries in 1997 and continues to cause widespread damage this year is a perfect example of what systems analysts call "the side-effects of global variables."

Take the most complex systems ever designed -- like the Apollo spacecraft system which took men to the moon and brought them back, or computer chips that are made of tens of millions of components, or a complex operating system with one hundred million lines of code. They work as designed because the system designers followed certain rules of design which time and again have been proven correct. Follow the design rules, and you get a system that is robust and reliable. Violate the design rules, and you get a system that is unreliable and crash-prone.

One of the most important rules that good designers will never violate is modularization: breaking up a complex system into relatively independent modules, which are isolated from each other except for a few well-defined interfaces.
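[Editor's note: a minimal Python sketch of the rule just stated, added for illustration. The module names and the thruster scenario are invented; the point is only that each module hides its internal state and exposes a single narrow interface, so the only possible interaction between the two modules is the one the designer declared.]

```python
class PowerModule:
    """Hides its internal state; exposes one small query method."""
    def __init__(self, capacity_watts):
        self._capacity = capacity_watts   # internal, private by convention

    def available_power(self):
        # The module's entire public interface.
        return self._capacity


class GuidanceModule:
    """Depends on PowerModule only through its public interface."""
    def __init__(self, power):
        self._power = power

    def can_fire_thrusters(self, required_watts):
        # The one well-defined interaction between the two modules.
        return self._power.available_power() >= required_watts


power = PowerModule(capacity_watts=500)
guidance = GuidanceModule(power)
print(guidance.can_fire_thrusters(300))   # True
print(guidance.can_fire_thrusters(800))   # False
```

If GuidanceModule reached directly into PowerModule's internals, a change in one would silently break the other; the narrow interface is what keeps the two modules independently testable and replaceable.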
This design rule can be found in all engineering and computer science texts. It is true for hardware and software designs. Most complex systems that violated this rule ended as miserable failures, while those which tried to implement it showed much better rates of success.

The reason for the rule is simple: as the number of components in a system increases, the number of possible interactions between components rises far faster -- pairwise interactions alone grow quadratically, and possible combinations of interacting components grow exponentially. Normally, all possible interactions must be checked for the possibility of unintended and undesirable results, called "side-effects." But beyond a certain number of components, it becomes impossible to double-check or even to trace the results of every possible interaction. Because these potentially undesirable side-effects increase at a faster rate than the number of components, they eventually bring the whole system crashing down.

Designers had earlier argued against modularization because it was "inefficient." Modular designs tended to use more components; a lot of thought and effort had to go into the interfaces between modules; some level of redundancy was required among the modules. But what was lost in efficiency was gained in reliability. Modular designs failed less often (mean time between failures is a standard measure of system reliability); and when they failed, errors were corrected faster. The history of systems design is replete with crashed spacecraft and crashed computer operating systems that drove home the point: complex systems must be broken up into smaller, more manageable, independent modules; otherwise, you get an unreliable, failure-prone, or unworkable design.

The opposite of modularization is globalization. It is true: that favorite word of World Bank and IMF economists is an absolute no-no among systems designers. Open any respectable textbook on computer science or system design, and one of the first design rules you are going to meet is: avoid anything that affects the entire system globally.
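[Editor's note: a short Python illustration of the two claims above, added for clarity; the exchange-rate scenario and all names are invented. The first part counts pairwise interactions with the standard binomial formula n(n-1)/2; the second shows how one global variable lets an unrelated piece of code change another function's behaviour without touching its code.]

```python
from math import comb

# 1. Interactions grow far faster than components: n components
#    admit C(n, 2) = n(n-1)/2 possible pairwise interactions alone.
for n in (10, 100, 1000):
    print(n, "components ->", comb(n, 2), "pairwise interactions")
# 10 -> 45, 100 -> 4950, 1000 -> 499500

# 2. A global variable couples otherwise unrelated code paths.
exchange_rate = 40.0              # global state: anything may touch it

def report_price(dollars):
    return dollars * exchange_rate

def speculative_attack():
    global exchange_rate          # a "global player" moves the variable
    exchange_rate = 55.0

before = report_price(100)        # 4000.0
speculative_attack()
after = report_price(100)         # 5500.0 -- report_price now behaves
                                  # differently, though its own code
                                  # never changed: a side-effect
```

This is exactly the hazard the textbooks warn about: the caller of report_price has no way to see, from that function alone, everything that can alter its result.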
Break up large systems into smaller modules. Protect your modules from interference by other modules. Isolate your modules from each other. Hide information. Build firewalls. Most of all, avoid global variables.

In an economic system, a global variable would be anything that can affect many portions of a large system. Global corporations, because they operate worldwide, would be a good example. The IMF, the World Bank, and the World Trade Organization (WTO), because they intrude into almost every economy in the world, would be another. Their moves and decisions affect many other economies in the world, producing consequences and interactions so numerous that it becomes impossible to anticipate and correct for undesirable side-effects. These side-effects then proliferate; eventually, they can bring the whole system down.

Unfortunately, most economists appear to have little understanding of system design. (When I was in college, many of those who failed our engineering subjects shifted to economics.) Instead of following good principles of design, our economists repeat the most common mistake of amateur programmers: they rely on global variables. Instead of building protective firewalls around our economy, they tear down existing walls of protection. Instead of strictly regulating those global variables that breach the walls that remain, they launch a perverse program of "deregulation," enlarging instead of restricting the impact of global variables. All those legal infrastructures which in the past protected us from the side-effects of global variables -- such as protective tariffs, foreign exchange controls, regulatory mechanisms and others which would have dampened the impact of global side-effects on our economy -- are being torn down. Instead of blocking IMF, WTO and World Bank interference, they kneel and bow before them.
Instead of relying on local variables and local interactions, which are manageable locally, they put greater reliance on global markets and global players, touting this reliance as "sound economic fundamentals."

It is interesting that neo-liberal economic theory conflicts with systems theory, though both claim to be sciences. Real science, however, anticipates reality better than pseudo-science. Looking at the current global financial crisis, it should be obvious which is which. Until we learn the basic lessons of systems design, and apply these to our own economy, we will be saddled with an unreliable, crash-prone economic system, one which will cause us endless suffering.

There is another lesson we can learn from successful designs of the past. If a system is badly designed, and suffers from too many global variables, any attempt at modification will likely produce even more unintended side-effects. Often, it is better to junk the misdesigned system altogether and to start again from scratch. Saddled as we are with a system that embraces globalization and leaves us at the mercy of its side-effects, that is perhaps what we should also do.

* Roberto Verzola is an engineer who specializes in computers. In 1981, with funding from the Philippine government, he designed a computer system, the first Filipino to do so. He also designed the software for the first online systems used at the Philippine Senate and House of Representatives in 1991. He is also an activist, and is the coordinator of Interdoc, a loose international network of NGOs tracking the social impacts of new information technologies.