Not a little misunderstanding of what optimizing compilers are good for, of
what they can and cannot do currently, has been evident in several different
IBM-MAIN threads/conversations during the last few days; and perhaps it will be
useful to try to explain where optimizing techniques stand today in not very
technical terms.
An example will help. Consider such a C statement as
target[j] = source[i] + 1;
Here typically a compiler must generate
o addressing code for element i of the array source,
o code that puts the value of source[i] into a register and adds 1 to it,
o addressing code for element j of the array target, and
o code that stores this register-resident value of source[i] + 1 into the
location of target[j].
If now we modify this example slightly, making it
target[j] = target[j] + 1 ;
source and target are now the same; and it is clear that we need calculate only
the location of target[j]. In fact the C language makes syntactic provision
for this special case: in C it is possible and appropriate to write
target[j] += 1 ;
instead. This is a very simple optimization [circumvention of the need for an
optimization], but it illustrates what optimization is about. An optimizing
compiler contains quite general machinery for recognizing interesting special
cases and then generating less general, more efficient code for them than it
would have generated if it had not recognized them.
A good first reference is:
F. E. Allen and John Cocke, "A catalog of optimizing transformations", Courant
Computer Science Symposium 5, Englewood Cliffs, NJ: Prentice-Hall, 1972, pp.
1-30.
It is now more than 30 years old, but it has not been supplanted.
Optimization is enormously useful, but it does not deal in radical changes of
algorithms.
It will not replace linear search in a table with binary search in a tree,
overlap two independent i/o operations asynchronously instead of doing them one
after the other synchronously, or the like. It transforms locally poor
constructs into functionally equivalent better ones by recognizing special
cases. It does no root and branch replacements of algorithms by other,
globally better ones.
There has been speculation about such transformations, about a
radical-optimizing-transformation compiler or ROTC, but none such has yet left
the laboratory.
Vague appeals to optimization like those we have seen here recently are
generic. Confronted with some indefensible deficiency of a language, it has
long been usual for defenders of that language to resort to the possibility of
optimizing it away, to use optimization as a deus ex machina that will
magically render the indefensible innocuous.
To attack such uses of optimization is not to disparage it or minimize its
importance, but it is important to understand when one is being had, when
plausible but specious (and always suspiciously vague) language is being used
to paper over real difficulties.
Edward Jaffe has sensitized us all to the notion that advice to 'write a
macro' without specifying how that macro will do its work is vacuous advice.
The injunction 'optimize it [away]' is often misused in the same fashion.
John Gilmore
Ashland, MA 01721-1817 USA
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html